Slides and demos created for the talk "Code Meets Art: Flutter for Creative Coding" given at FlutterCon Berlin 2024
Slides built with flutter_deck 🪄
During tech conferences, people attend talks and workshops to learn about crucial topics that they will hopefully take home and apply in their day-to-day careers to build better apps and software. With this talk, however, my goal is to inspire at least one person in the audience to go home, put aside all their responsibilities, their full-time job, clients, product specs, and deadlines, and write code that has one goal and one goal only:
It paints beautiful pixels on the screen.
I truly believe that fascinating things can happen when you combine creativity with analytical and algorithmic thinking. When you do that, you become a better developer by being exposed to a universe of creative learning material. You learn data structures by studying how actual trees grow; you learn the algorithms nature uses to flock birds or grow cells, and apply those algorithms to build your server infrastructure or improve a user interface. And being skilled at writing code that paints pixels unlocks your ability to build visualization tooling that helps you learn complex concepts and algorithms much faster than you would have otherwise.
Huge thanks to Daniel Shiffman, the creator of The Coding Train for his fun and inspiring educational content. His coding challenge "Weighted Voronoi Stippling" was the main inspiration for this talk 🤩
Inspired by the original tutorial, the concepts of Delaunay triangulation, Voronoi diagrams, and Lloyd's relaxation algorithm come together to apply a stippling effect to the camera input, using Flutter and its Canvas API to read the image's pixel colors and paint the shapes on the screen.
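The core of weighted stippling is one step of weighted Lloyd's relaxation: every point moves to the darkness-weighted centroid of the pixels nearest to it, so points cluster in dark regions. A rough, self-contained sketch of that step (my own brute-force illustration, not the repo's code — the actual demo uses the Delaunay structure for the nearest-point lookup instead of this O(pixels × points) scan):

```dart
import 'dart:typed_data';

/// One iteration of weighted Lloyd's relaxation over raw RGBA pixels.
/// Each point is pulled toward the darkness-weighted centroid of the
/// pixels closest to it (a brute-force Voronoi assignment).
void relaxPoints(
    List<double> xs, List<double> ys, Uint8List rgba, int width, int height) {
  final n = xs.length;
  final cx = List<double>.filled(n, 0);
  final cy = List<double>.filled(n, 0);
  final wSum = List<double>.filled(n, 0);

  for (var y = 0; y < height; y++) {
    for (var x = 0; x < width; x++) {
      // Find the nearest stipple point (O(n) per pixel; fine for a sketch).
      var best = 0;
      var bestD = double.infinity;
      for (var i = 0; i < n; i++) {
        final dx = xs[i] - x, dy = ys[i] - y;
        final d = dx * dx + dy * dy;
        if (d < bestD) {
          bestD = d;
          best = i;
        }
      }
      // Weight by darkness: dark pixels attract points more strongly.
      final o = (y * width + x) * 4;
      final brightness = (rgba[o] + rgba[o + 1] + rgba[o + 2]) / 3;
      final w = 1 - brightness / 255;
      cx[best] += x * w;
      cy[best] += y * w;
      wSum[best] += w;
    }
  }
  // Move each point to its weighted centroid.
  for (var i = 0; i < n; i++) {
    if (wSum[i] > 0) {
      xs[i] = cx[i] / wSum[i];
      ys[i] = cy[i] / wSum[i];
    }
  }
}
```

Running this a few times per frame on the camera pixels is what makes the dots "chase" the dark areas of the image.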
I had the main demo running before the talk started, while people were coming in and finding their seats. And it was extremely heart-warming to see people interact and have fun with it 🥹
flutter-con-2024-pre-talk.MOV
The demo can be found in the `app` folder of this repo, particularly in the `camera_image_stippling_demo_page.dart` file.
I've also added controls to the demo UI, so you can run it and adjust things like the stippling mode (dots, circles, Voronoi polygons, etc.), the points count (how far can you go before your machine goes 💥? 🫢), colors, dot sizes, etc. And you can take a screenshot and share it if you like!
fluttercon-24-controls-1-1920.mp4
fluttercon-24-controls-2-1920.mp4
fluttercon-24-controls-3-1920.mp4
For me, the whole point of the talk was the interactive demo experience. And it all depended on me setting up my phone so that its camera pointed at the audience, to properly display the camera input with the effect applied to it. That could have easily been achieved with the phone stand I had, but being the clumsy, nervous person that I am, I forgot to bring it with me 🤦🏻♀️. Thankfully, the kindest stage manager saw me struggling to prop my phone up in a paper cup, had a lightbulb moment, went away, and came back shortly after with a roll of scotch tape that did the job perfectly. Huge thanks to him for being a major part of making this interactive experience possible 🫡😁
The Delaunay algorithm implementation included in this repo is taken from the `delaunay` Dart package. I chose to use the package's code directly because I needed to extend and modify it a lot, for example, by adding the `_inedges` and `_hullIndex` lists as well as the `find()` method, whose implementation is based on the D3 library's `delaunay.find()` method.
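For context, D3-style `find()` does "jump-and-walk" point location: starting from a seed point, it repeatedly hops to whichever Delaunay neighbor is closer to the query until no neighbor improves on the current point. A generic sketch of that idea (the `neighbors` callback is a stand-in for the adjacency the `_inedges`/`_hullIndex` lists make cheap to compute; names here are my own, not the repo's):

```dart
/// Jump-and-walk nearest-point search in the style of d3-delaunay's
/// `delaunay.find()`. `neighbors(i)` is assumed to enumerate the
/// indices connected to point `i` by a Delaunay edge.
int findNearest(
  List<double> xs,
  List<double> ys,
  Iterable<int> Function(int i) neighbors,
  double qx,
  double qy, {
  int start = 0,
}) {
  double dist(int i) {
    final dx = xs[i] - qx, dy = ys[i] - qy;
    return dx * dx + dy * dy;
  }

  var current = start;
  var d = dist(current);
  var improved = true;
  while (improved) {
    improved = false;
    for (final n in neighbors(current)) {
      final dn = dist(n);
      if (dn < d) {
        // Greedy hop toward the query point.
        d = dn;
        current = n;
        improved = true;
        break;
      }
    }
  }
  return current;
}
```

Because each hop strictly decreases the distance, the walk terminates, and for well-distributed points it visits far fewer sites than a linear scan.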
With the limited time I had, I extended the Delaunay implementation with a Voronoi diagram implementation that turned out a bit buggy. I based it on the guide provided by the original library, and handled the infinite polygons at the canvas edges using the method explained in this tutorial, but for reasons beyond what my time allows me to investigate, some weird artifacts appear at the edges. The optimal way to handle polygons at the edges would be, again, similar to how the D3 library handles it. So that's a ToDo for a time that may never come 🙈, or a great reason to contribute 👀
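The reason the Voronoi diagram falls out of the Delaunay triangulation at all is that every Voronoi vertex is the circumcenter of a Delaunay triangle; connecting the circumcenters of adjacent triangles traces the cell edges. The building block is just the circumcenter formula (a standalone sketch, not the repo's exact code):

```dart
/// Circumcenter of the triangle (ax,ay)-(bx,by)-(cx,cy).
/// Voronoi vertices are exactly the circumcenters of the Delaunay
/// triangles, so this is the building block for deriving the diagram.
List<double> circumcenter(
    double ax, double ay, double bx, double by, double cx, double cy) {
  final dx = bx - ax, dy = by - ay;
  final ex = cx - ax, ey = cy - ay;
  final bl = dx * dx + dy * dy; // squared length of edge a->b
  final cl = ex * ex + ey * ey; // squared length of edge a->c
  final d = 0.5 / (dx * ey - dy * ex); // degenerate if the points are collinear
  return [
    ax + (ey * bl - dy * cl) * d,
    ay + (dx * cl - ex * bl) * d,
  ];
}
```

The tricky part (and where the edge artifacts come from) is not this formula but clipping the unbounded cells of hull triangles against the canvas rectangle.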
Instead of using the camera image streams, I chose to use a `RepaintBoundary` and `Ticker` combination to read the pixels of the screen, for a couple of reasons. The main one is that the image stream implementation of the `camera_macos` package is buggy and causes the whole app to crash. Another is that this way, basically any combination of widgets can be the base the stippling is performed on, so feel free to experiment with that! Lastly, with this approach it's easy to control the frame rate and limit how many times per second the pixels are read, for better performance.
Unfortunately, due to several limitations, neither the slides nor the app with its demos runs on the web. One limitation is that the Delaunay algorithm for some reason does not work properly on the web, and I haven't spent any time investigating the issue.
Another major limitation is that reading image pixels from the camera is currently not possible on the web. The `camera_web` package still does not support image streams, and reading pixel data from a `RepaintBoundary` wrapping the camera view is not possible.
I really wanted this to work on the web so I could host it and make it easily accessible to everyone, but that's a problem that can hopefully be fixed another day. If anyone has a solution, contributions are highly welcome! 🤗
Here are some solution ideas I've thought of to make this work on web:
- Using live streaming to get the camera input from another device.
- Using Poisson disk sampling to sample the camera input and create a similar effect without the need for the Delaunay or Voronoi algorithms.
- Using JavaScript interoperability to use the algorithms of the original JavaScript libraries.
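To make the Poisson disk sampling idea from the list above concrete, here is a sketch of Bridson's algorithm (my own illustration; all names are hypothetical). It generates points that are at least `r` apart; sampling densely and keeping a point only where the underlying image is dark would give a stippling-like effect with no Delaunay or Voronoi machinery at all:

```dart
import 'dart:math';

/// Bridson's Poisson disk sampling: points at least [r] apart inside
/// a width x height rectangle. [k] is the number of candidate points
/// tried around each active sample before it is retired.
List<Point<double>> poissonDisk(double width, double height, double r,
    {int k = 30, int seed = 42}) {
  final rng = Random(seed);
  final cell = r / sqrt2; // grid cell holds at most one sample
  final gw = (width / cell).ceil(), gh = (height / cell).ceil();
  final grid = List<Point<double>?>.filled(gw * gh, null);
  final active = <Point<double>>[];
  final samples = <Point<double>>[];

  void add(Point<double> p) {
    samples.add(p);
    active.add(p);
    grid[(p.y ~/ cell) * gw + (p.x ~/ cell)] = p;
  }

  // A conflicting sample can only live within 2 grid cells of p.
  bool farEnough(Point<double> p) {
    final gx = p.x ~/ cell, gy = p.y ~/ cell;
    for (var y = max(gy - 2, 0); y <= min(gy + 2, gh - 1); y++) {
      for (var x = max(gx - 2, 0); x <= min(gx + 2, gw - 1); x++) {
        final q = grid[y * gw + x];
        if (q != null && p.distanceTo(q) < r) return false;
      }
    }
    return true;
  }

  add(Point(rng.nextDouble() * width, rng.nextDouble() * height));
  while (active.isNotEmpty) {
    final i = rng.nextInt(active.length);
    final p = active[i];
    var placed = false;
    for (var t = 0; t < k; t++) {
      // Candidate in the annulus [r, 2r) around the active sample.
      final a = rng.nextDouble() * 2 * pi;
      final d = r * (1 + rng.nextDouble());
      final c = Point(p.x + cos(a) * d, p.y + sin(a) * d);
      if (c.x >= 0 && c.x < width && c.y >= 0 && c.y < height &&
          farEnough(c)) {
        add(c);
        placed = true;
        break;
      }
    }
    if (!placed) active.removeAt(i); // no room left around this sample
  }
  return samples;
}
```

Varying `r` with local image brightness is what would turn this into a stippling effect: smaller radii (denser points) in dark regions, larger radii in bright ones.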