Use PHP to fetch data from a public API. Store the data in a MySQL or PostgreSQL database. You can choose any public API you like; one example is http://api.krisinformation.se/. If you feel you have time: write a Node application that retrieves the data stored in the database and presents it in a suitable way. The source code should be placed in a GitHub repository.
Solution: xkcd cacher
with a (quick) Docker setup to allow a modern development/deployment workflow
- Go to http://localhost/xkcd.php?num=1934 to get the xkcd comic with that `num` as JSON. See the `status` message to know where the data is originating from, or if there were any errors.
  - This will either return a cached response from the local Postgres instance, or fetch directly from the xkcd API.
  - Try changing the `?num` query param to your favorite comic!
- Go to http://localhost for further instructions and links.
- When you've cached a few comics, go to http://localhost:3000 to see them rendered using data from PostgreSQL.
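The cache-or-fetch behavior described above is a classic cache-aside pattern. Here is a minimal sketch of that logic in Node.js (the function names and the in-memory store are illustrative assumptions, not the project's actual PHP implementation):

```javascript
// Cache-aside sketch: look up the comic in the local store first;
// otherwise fetch it from the xkcd API and cache it before returning.
// `store` and `fetchComic` are injected so the logic is easy to test
// without a real database or network access.
async function getComic(num, store, fetchComic) {
  const cached = await store.get(num);
  if (cached) {
    return { ...cached, status: "from cache" };
  }
  const comic = await fetchComic(num); // e.g. GET https://xkcd.com/<num>/info.0.json
  await store.set(num, comic);
  return { ...comic, status: "from xkcd API" };
}

// Example with an in-memory Map standing in for Postgres:
const memory = new Map();
const store = {
  get: async (n) => memory.get(n),
  set: async (n, comic) => { memory.set(n, comic); },
};
const fakeFetch = async (n) => ({ num: n, title: "example" });

getComic(1934, store, fakeFetch).then((c) => console.log(c.status)); // "from xkcd API"
```

In the real project the store would be backed by a Postgres table instead of a `Map`, but the branching logic stays the same.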
To get up and running quickly, this solution is based on https://github.com/lvthillo/docker-php-postgres/
I figured that since I'm no Docker expert (yet), I should focus on what I know: PHP and Node.js.
For the Node.js + Postgres setup, I used parts of https://github.com/MichalZalecki/docker-compose-node-postgres and modified it to fit with my project.
Except for some struggles with setting up the Docker image for `php-app`, I'm pretty happy with the end result. It would have been really nice to get GuzzleHttp working with the `php-app` Docker image, but that could always be added later on if needed. The most important thing is that the project goals were fulfilled, and in a reasonable time once I finally got time to work on the project.
The project config, with two separate `env` files for secret variables, is not optimal. But it works for now, and could always be refactored for a project that should be deployed to production.
- I spent about 1h setting up the foundation for the docker-compose environment.
- Then ~1h setting up PHP and trying to add GuzzleHttp, but eventually giving up (for this time).
- Then ~2h developing the PHP backend to fetch & cache xkcd comics, and documenting the project, adding some structure and basic error handling. This took some extra time due to the lack of Xdebug, having to rely on slow log-based debugging instead.
- Then ~2h to develop the Node.js frontend, all the way from a fresh Docker image to a completed module. This was by far the easiest part, since JS is so much easier to work with.
- Finally ~30 min to analyze the solution and write down these ideas for improvements.
First, make sure `docker-compose` is installed. Then run the following to install dependencies and create a local environment:
$ docker-compose up -d
Access the PHP app on http://localhost to get started.
Visit http://localhost:5050 to inspect PostgreSQL with pgAdmin. The default credentials are:
- username: firstname.lastname@example.org
- password: mypassword
Add a connection:
Check the data:
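For a quick sanity check in pgAdmin's query tool, something along these lines should work (the table and column names below are assumptions for illustration, not taken from the project's actual schema):

```sql
-- List the cached comics (assumed table name: comics)
SELECT num, title FROM comics ORDER BY num;
```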
The `docker-compose.yaml` is very flexible. By editing the environment variables inside the file you can define the following:
- POSTGRES_USER (default = dev)
- POSTGRES_PASSWORD (default = secret)
- POSTGRES_DB (default = db)
- DEFAULT_USER (default = email@example.com)
- DEFAULT_PASSWORD (default = mypassword)
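For example, the Postgres-related variables would sit under the database service's `environment` key, roughly like this (a sketch of the relevant fragment, not the file's exact contents):

```yaml
services:
  postgres:
    environment:
      POSTGRES_USER: dev        # database user
      POSTGRES_PASSWORD: secret # database password
      POSTGRES_DB: db           # database name
```

After changing any of these values, re-run `docker-compose up -d` to recreate the affected containers.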