Small chat application demonstrating asynchronous responses in Bottle using gevent
# Chattle Chat App

*(This is a work in progress, please improve it!)*

Chattle is a small chat app demonstrating the use of asynchronous responses with the web framework Bottle. It uses gevent to allow many concurrent connections from chat clients to the chat server.

## Dependencies

Install Bottle and gevent with `pip install bottle gevent`, then start the server:

```
$ python chattle.py
```

Open different browsers/windows and start chatting!

If you want more control over the server setup, install Gunicorn (`pip install gunicorn`) and run:

```
$ gunicorn --bind 0.0.0.0:8080 --worker-class gevent --workers 1 chattle
```

See the Gunicorn documentation for more options, such as daemonizing or dropping privileges when started as root.

If you increase the number of workers, you can observe the weirdness that results when every worker maintains its own unshared copy of the chat history.

## How It Works

Server and client communicate exclusively via JSON.

The client establishes a connection to the server and waits until a message arrives. It processes the message, opens a new connection to the server, waits for the next message, and so on (a long-polling loop).
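In the browser this loop runs in JavaScript, but the pattern itself can be sketched in a few lines of Python. The `fetch` callable and the JSON message shape below are illustrative stand-ins, not Chattle's actual API:

```python
import json

def poll_loop(fetch, handle, rounds):
    """Long-poll pattern: request, block until a message arrives,
    process it, then immediately reconnect for the next one."""
    for _ in range(rounds):
        raw = fetch()            # blocks until the server responds
        message = json.loads(raw)
        handle(message)          # in the browser: render via jsviews

# Simulated server responses standing in for blocking HTTP requests.
responses = iter(['{"user": "alice", "text": "hi"}',
                  '{"user": "bob", "text": "hello"}'])
seen = []
poll_loop(lambda: next(responses), seen.append, rounds=2)
```

After the loop, `seen` holds both decoded messages in arrival order.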

The rendering of the HTML output is done via jsviews.

The server uses `Event.wait()` from `gevent.event` to block the handler's greenlet while there are no new messages to send to the client. When a chat message arrives, `Event.set()` wakes all waiting greenlets so they can send the new message to their corresponding clients.
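This wake-up mechanism can be demonstrated standalone with gevent. The sketch below is a simplified illustration of the pattern, not Chattle's actual code; `client` and `broadcast` are made-up names:

```python
import gevent
from gevent.event import Event

messages = []            # in-memory chat history, as in Chattle
new_message = Event()    # signals waiting greenlets that a message arrived

def client(name, received):
    # a long-polling request handler: block until the next broadcast
    new_message.wait()
    received.append((name, messages[-1]))

def broadcast(text):
    messages.append(text)
    new_message.set()    # wake every greenlet blocked in wait()
    gevent.sleep(0)      # yield so the waiters run before we re-arm
    new_message.clear()  # re-arm the event for the next message

received = []
clients = [gevent.spawn(client, i, received) for i in range(3)]
gevent.sleep(0)          # let all clients reach wait()
broadcast('hello')
gevent.joinall(clients)
```

All three waiting greenlets receive the same `'hello'` from a single `set()` call; the `clear()` afterwards ensures the next `wait()` blocks again until a new message arrives.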

The messages on the server are kept in memory, so when running multiple workers these message histories get out of sync. Persistence to the rescue!
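One way out is to move the history into a small shared store. A sketch with SQLite follows; the schema and function names are invented for illustration, and `':memory:'` is used only so the sketch is self-contained (real gunicorn workers would share an on-disk file or a database server):

```python
import sqlite3

# A history that all workers read from the same place stays consistent,
# unlike a per-process Python list.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE IF NOT EXISTS messages (user TEXT, text TEXT)')

def save(user, text):
    # called whenever a chat message arrives
    with conn:
        conn.execute('INSERT INTO messages VALUES (?, ?)', (user, text))

def history():
    # any worker sees messages saved by any other worker
    return conn.execute('SELECT user, text FROM messages').fetchall()

save('alice', 'hi')
```

Note that a shared history alone does not carry `Event.set()` across processes; waking waiters in other workers would additionally need some cross-process notification, e.g. a pub/sub channel.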