|
1 | 1 | # node-postgres
2 | 2 |
|
3 | | -100% javascript. 100% async. 100% would love your contributions. |
| 3 | +Non-blocking (async) JavaScript PostgreSQL client for node.js, developed in a fully TDD fashion
4 | 4 |
|
5 | | -## ALPHA version |
| 5 | +## alpha version |
6 | 6 |
|
7 | | -Implemented in a fully TDD fashion. I'm aiming for |
8 | | -extremely high quality code, but first doing the implementation and |
9 | | -only refactoring after tests are in place. |
| 7 | +### Whirlwind tour |
10 | 8 |
|
11 | | -##### Installation |
12 | | - |
13 | | -Clone the repo. There are __no__ dependencies. |
14 | | - |
15 | | - |
16 | | - git clone git://github.com/brianc/node-postgres |
17 | | - cd node-postgres |
18 | | - node test/run.js |
19 | | - |
20 | | -And just like magic, you're ready to contribute! <3 |
21 | | - |
22 | | -I don't have _style guidelines_ or anything right now. I'm 100x more |
23 | | -concerned with test coverage, functionality, and happy coding than I |
24 | | -am about whether or not you've got the proper spacing after your `{ hash: 'separators' }` |
25 | | - |
26 | | -### Connection |
27 | | - |
28 | | -The connection object is a 1 to 1 mapping to the [postgres |
29 | | -client/server messaging protocol](http://developer.postgresql.org/pgdocs/postgres/protocol.html). |
30 | | -The __Connection__ object is mostly used by the Client object (which...I haven't yet
31 | | -finished implementing) but you can do anything you want with PostgreSQL using |
32 | | -the connection object if you're really into that. I studied the
33 | | -protocol at length while implementing this, and the documentation is pretty
34 | | -solid. If you're already familiar with it you should be right at home. Have
35 | | -fun looking up the [oids for the datatypes in your bound queries](http://github.com/brianc/node-postgres/blob/master/script/list-db-types.js) |
36 | | - |
37 | | -There are a few minor variations from the protocol: |
38 | | - |
39 | | -- The connection only supports 'text' mode right now. |
40 | | -- Renamed 'passwordMessage' to 'password' |
41 | | -- Renamed 'startupMessage' to 'startup' |
42 | | -Renamed 'errorResponse' to 'error'
43 | | -Renamed 'noticeResponse' to 'notice'
44 | | - |
45 | | -The reason for the renamings is that 90% of the message names in the
46 | | -protocol do not contain "message", "request", "response", or anything
47 | | -similar, and I feel it's a bit redundant to send a "passwordMessage
48 | | -message." But then again...[I do say ATM machine](http://en.wikipedia.org/wiki/RAS_syndrome). |
49 | | - |
50 | | -Anyways...using a connection directly is a pretty verbose and
51 | | -cumbersome affair. Here's an example of executing a prepared query
52 | | -using the __Connection__ api directly, in compliance with
53 | | -the PostgreSQL protocol.
54 | | - |
55 | | -_note: this works and is taken directly from an integration test; |
56 | | -however, it doesn't include error handling_ |
57 | | - |
58 | | - var con = new Connection({stream: new net.Stream()}); |
59 | | - |
60 | | - con.connect('5432','localhost'); |
61 | | - |
62 | | - con.once('connect', function() { |
63 | | - |
64 | | - con.startup({ |
65 | | - user: username, |
66 | | - database: database |
67 | | - }); |
68 | | - |
69 | | - con.once('readyForQuery', function() { |
70 | | - |
71 | | - con.query('create temp table ids(id integer)'); |
72 | | - |
73 | | - con.once('readyForQuery', function() { |
74 | | - |
75 | | - con.query('insert into ids(id) values(1); insert into ids(id) values(2);'); |
76 | | - |
77 | | - con.once('readyForQuery', function() { |
78 | | - |
79 | | - con.parse({ |
80 | | - text: 'select * from ids' |
81 | | - }); |
82 | | - con.flush(); |
83 | | - |
84 | | - con.once('parseComplete', function() { |
85 | | - con.bind(); |
86 | | - con.flush(); |
87 | | - }); |
88 | | - |
89 | | - con.once('bindComplete', function() { |
90 | | - con.execute(); |
91 | | - con.flush(); |
92 | | - }); |
93 | | - |
94 | | - con.once('commandComplete', function() { |
95 | | - con.sync(); |
96 | | - }); |
97 | | - |
98 | | - con.once('readyForQuery', function() { |
99 | | - con.end(); |
100 | | - }); |
101 | | - }); |
102 | | - }); |
103 | | - }); |
| 9 | + var Client = require('node-postgres').Client; |
| 10 | + var client = new Client({ |
| 11 | + user: 'brianc', |
| 12 | + database: 'test', |
| 13 | + password: 'boom' //plaintext or md5 supported |
104 | 14 | }); |
105 | 15 |
|
| 16 | + client.connect(); |
106 | 17 |
|
107 | | -### Client |
108 | | - |
109 | | -Basically a facade on top of the connection to provide a _much_ more |
110 | | -user friendly, "node style" interface for doing all the lovely things |
111 | | -you like with PostgreSQL. |
| 18 | + var printRow = function(row) { |
| 19 | + console.log(row.fields); |
| 20 | + }; |
112 | 21 |
|
113 | | -Now that I've got the __Connection__ api in place, the bulk and meat of |
114 | | -the work is being done on the __Client__ to provide the best possible |
115 | | -API. Help? Yes please! |
| 22 | + var simpleQuery = client.query("select * from user where heart = 'big'"); |
| 23 | + simpleQuery.on('row', printRow); |
116 | 24 |
|
117 | | - var client = new Client({ |
118 | | - user: 'brian', |
119 | | - database: 'postgres', |
| 25 | + var preparedStatement = client.query({ |
| 26 | + name: 'user by heart type', |
| 27 | + text: 'select * from user where heart = $1', |
| 28 | + values: ['big'] |
120 | 29 | }); |
| 30 | + preparedStatement.on('row', printRow); |
121 | 31 |
|
122 | | - client.query("create temp table ids(id integer)"); |
123 | | - client.query("insert into ids(id) values(1)"); |
124 | | - client.query("insert into ids(id) values(2)"); |
125 | | - var query = client.query("select * from ids", function(row) { |
126 | | - row.fields[0] // <- that equals 1 the first time. 2 the second time. |
| 32 | + var cachedPreparedStatement = client.query({ |
| 33 | + name: 'user by heart type', |
| 34 | + values: ['filled with kisses'] |
127 | 35 | }); |
128 | | - query.on('end', function() { |
129 | | - client.end(); |
130 | | - }); |
131 | | - |
132 | | -#### Prepared statements |
133 | | - |
134 | | -I'm still working on the API for prepared statements. Check out the |
135 | | -tests for more up to date examples, but what I'm working towards is |
136 | | -something like this: |
137 | | - |
138 | | - |
139 | | - var client = new Client({ |
140 | | - user: 'brian', |
141 | | - database: 'test' |
142 | | - }); |
143 | | - |
144 | | - var query = client.query({ |
145 | | - text: 'select * from person where age < $1', |
146 | | - values: [21] |
147 | | - }); |
148 | | - |
149 | | - query.on('row', function(row) { |
150 | | - console.log(row); |
151 | | - }); |
152 | | - |
153 | | - query.on('end', function() { client.end() }); |
154 | | - |
155 | | -## Testing |
156 | | - |
157 | | -The tests are split into two categories: unit tests and
158 | | -integration tests.
159 | | - |
160 | | -### Unit tests |
161 | | - |
162 | | -Unit tests do not depend on having access to a |
163 | | -running PostgreSQL server. They work by mocking out the `net.Stream` |
164 | | -instance into a `MemoryStream`. The memory stream raises 'data' |
165 | | -events with pre-populated packets which simulate communication from an
166 | | -actual PostgreSQL server. Some tests validate that incoming packets
167 | | -are parsed correctly by the __Connection__ and some tests validate the |
168 | | -__Connection__ correctly sends outgoing packets to the stream. |
169 | | - |
170 | | -### Integration tests |
171 | | - |
172 | | -The integration tests operate on an actual database and require
173 | | -access to a running server. They're in a bit more flux as the
174 | | -client api is still changing; however, they should always all be
175 | | -passing on every push up to the ol' githubber.
176 | | -### Running tests |
| 36 | + cachedPreparedStatement.on('row', printRow); |
177 | 37 |
|
178 | | -You can run any test file directly by running `node
179 | | -test/unit/connection/inbound-parser-tests.js` or the
180 | | -like.
| 38 | + cachedPreparedStatement.on('end', function() { client.end(); });
181 | 39 |
|
182 | | -However, you can specify command line arguments after the file
183 | | -name and they will be picked up and used in the tests. None of the
184 | | -arguments are used in _unit_ tests, so you're safe to just blast away
185 | | -with the command line above, but if you'd like to execute an
186 | | -_integration_ test, you ought to specify your database, the user to use for
187 | | -testing, and optionally a password.
| 40 | +### Philosophy |
188 | 41 |
|
189 | | -To do so you would do something like so: |
| 42 | +* well tested |
| 43 | +* no monkey patching |
| 44 | +* no dependencies (well...besides PostgreSQL) |
190 | 45 |
|
191 | | - node test/integration/client/simple-query-tests.js -u brian -d test_db |
| 46 | +### Installation |
192 | 47 |
|
193 | | -If you'd like to execute all the unit or integration tests at one |
194 | | -time, you can do so with the "run.js" script in the /test directory as |
195 | | -follows: |
| 48 | +Clone the repo. |
196 | 49 |
|
197 | | -##### Run all unit tests |
198 | | - |
199 | | - node test/run.js -t unit |
200 | | - |
201 | | -or optionally, since `-t unit` is the default |
202 | | - |
203 | | - node test/run.js |
204 | | - |
205 | | -##### Run all integration tests |
206 | | - |
207 | | - node test/run.js -t integration -u brian -d test_db --password password! |
208 | | - |
209 | | -##### Run all the tests! |
210 | | - |
211 | | - node test/run.js -t all -u brian -d test_db --password password! |
212 | | - |
213 | | -In short, I tried to make executing the tests as easy as possible. |
214 | | -Hopefully this will encourage you to fork, hack, and do whatever you |
215 | | -please as you've got a nice, big safety net under you. |
216 | | - |
217 | | -#### Test data |
| 50 | + git clone git://github.com/brianc/node-postgres |
| 51 | + cd node-postgres |
| 52 | + node test/run.js |
218 | 53 |
|
219 | | -In order for the integration tests to not take ages to run, I've |
220 | | -pulled out the script used to generate test data. This way you can |
221 | | -generate a "test" database once and don't have to up/down the tables |
222 | | -every time an integration test runs. To run the generation script, |
223 | | -execute the script with the same command line arguments passed to any |
224 | | -other test script. |
| 54 | +And just like magic, you're ready to contribute! <3 |
225 | 55 |
|
226 | | - node script/create-test-tables.js -u user -d database |
| 56 | +## More info please |
227 | 57 |
|
228 | | -Additionally, if you want to revert the test data, you'll need to "down"
229 | | -the database first and then re-create the data as follows:
| 58 | +Srsly check out the [[wiki]]. MUCH more information there. |
230 | 59 |
|
| 60 | +p.s. want your own offline version of the wiki? |
231 | 61 |
|
232 | | - node script/create-test-tables.js -u user -d database --down |
233 | | - node script/create-test-tables.js -u user -d database |
| 62 | + git clone git://github.com/brianc/node-postgres.wiki.git |
234 | 63 |
|
235 | | -## TODO |
236 | | - - Query results returned |
237 | | - - some way to return number of rows inserted/updated etc |
238 | | - (supported in protocol and handled in __Connection__ but not sure |
239 | | - where on the __Client__ api to add this functionality) |
240 | | - - Typed result set support in client |
241 | | - - simple queries |
242 | | - - bound commands |
243 | | - - edge cases |
244 | | - - [numeric 'NaN' result](http://www.postgresql.org/docs/8.4/static/datatype-numeric.html) |
245 | | - - float Infinity, -Infinity |
246 | | - - Error handling |
247 | | - - disconnection, removal of listeners on errors |
248 | | - - passing errors to callbacks? |
249 | | - - more integration testing |
250 | | - - bound command support in client |
251 | | - - type specification |
252 | | - - parameter specification |
253 | | - - transparent bound command caching? |
254 | | - - nice "cursor" (portal) api |
255 | | - - connection pooling |
256 | | - - copy data? |
257 | | - - kiss the sky |
| 64 | +__github is magic__ |
258 | 65 |
|
259 | | -## Why? |
| 66 | +### Why? |
260 | 67 |
|
261 | 68 | As soon as I saw node.js for the first time I knew I had found |
262 | 69 | something lovely and simple and _just what I always wanted!_. So...I |
263 | | -poked around for a while. I was excited. I told my friend "ah man |
264 | | -the only thing holding node back is a really solid data access story." |
265 | | -I mean...let's put the NoSQL debate aside. Let's say for arguments |
266 | | -sake you have to run a query from node.js on PostgreSQL before the |
| 70 | +poked around for a while. I was excited. I still am! |
| 71 | + |
| 72 | +Let's say for arguments sake you have to run a query from node.js on PostgreSQL before the |
267 | 73 | last petal falls off the rose and you are stuck as a beast forever? |
| 74 | +You can't use NoSQL because your boss said he'd pour a cup of |
| 75 | +Hoegarten into your laptop fan vent and you _hate_ that beer? |
268 | 76 | What if your entire production site depends on it? Well, fret no |
269 | 77 | more. And let [GastonDB](http://www.snipetts.com/ashley/mymusicals/disney/beauty-an-beast/images/gaston.gif) be vanquished. |
270 | 78 |
|
271 | 79 | I drew major inspiration from |
272 | 80 | [postgres-js](http://github.com/creationix/postgres-js). I didn't |
273 | 81 | just fork and contribute because it has |
274 | | -__0__ tests included with it and doesn't seem to be actively developed |
275 | | -anymore. I am not comfortable forking & playing with a project |
276 | | -without having a way to run a test suite, let alone using it in |
277 | | -production. |
| 82 | +_0_ tests included with it, adds a bunch of methods to the Buffer() |
| 83 | +object, and doesn't seem to be maintained. Still...was a lovely way |
| 84 | +to learn & excellent reference material. |
278 | 85 |
|
279 | 86 | I also drew some major inspirrado from |
280 | 87 | [node-mysql](http://github.com/felixge/node-mysql) and liked what I |
|