I am a Software Engineer working at Google on the Google Assistant project.
Programming tools & languages are a special interest of mine, and if you want to get me unreasonably excited, talk to me about Smalltalk, Emacs, or anything else that provides an interactive development experience.
In the past I have built software construction kits and dynamic object systems and have researched user interfaces and architectures for live programming. Among other things, I built lively.next and Lively Web.
Projects

A selection of previous projects. Everything here is open source and can be found on GitHub.
lively.next is a web-based personal construction kit and development environment. Its user interface allows direct manipulation, composition, and behavior modification of objects, similar to Smalltalk and Self. User content (workspaces and individual objects) is fully serializable, so users can simply create snapshots of their work, either for later retrieval or for collaboration.
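The snapshot idea can be illustrated with a minimal sketch. The function names and the JSON format below are illustrative assumptions, not lively.next's actual serializer: an object's state is captured as plain data, and a revived copy gets its prototype (and therefore its behavior) back.

```javascript
// Hypothetical snapshot API: capture an object's properties as JSON so the
// result can be stored, shared, and revived later.
function snapshot(obj) {
  // Record the constructor name so revival can restore the right prototype.
  return JSON.stringify({ class: obj.constructor.name, props: { ...obj } });
}

function revive(json, classes) {
  const { class: name, props } = JSON.parse(json);
  // Re-create an instance with the saved class's prototype, then copy the
  // saved properties back onto it.
  return Object.assign(Object.create(classes[name].prototype), props);
}

class Morph { moveBy(dx, dy) { this.x += dx; this.y += dy; } }

const m = Object.assign(new Morph(), { x: 10, y: 20 });
const saved = snapshot(m);                 // a plain string; store or share it
const restored = revive(saved, { Morph });
restored.moveBy(5, 0);                     // behavior is back, state preserved
```

A real object serializer also has to handle object graphs with shared references and cycles; this sketch only shows the state-plus-prototype round trip.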
In the tradition of Smalltalk systems, lively.next is fully implemented in itself (client as well as server), and the entire system can be inspected and modified on the fly. This enables very short turnaround times for new features and makes for a joyful development experience.
Lively can be used for rapid prototyping or as a full-scale development environment that integrates well with third-party systems. To learn more about lively.next, visit the project page or dive into the live system.
This prototype implements a direct manipulation tile scripting interface that seamlessly produces a textual program representation.
Starting from an inspector that provides access to properties and behaviors of objects, a user can assemble programs by dragging and dropping tile representations of these properties or control flow structures. The assembled textual program can be modified directly or via further drag-and-drop operations since the program, even when presented in its textual form, can be converted back into tiles using its syntax tree.
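The key property described above is that text and tiles are two views of the same syntax tree. The toy sketch below uses made-up function names and a deliberately tiny grammar (a single `receiver.method(arg)` call) to show the round trip; the actual prototype works on full programs via their real syntax trees.

```javascript
// Parse a single "receiver.method(arg)" expression into a tiny AST node.
function parseCall(src) {
  const [, receiver, method, arg] = src.match(/^(\w+)\.(\w+)\((\w*)\)$/);
  return { type: "call", receiver, method, arg };
}

// Render the AST as tiles the user could rearrange by drag and drop.
function toTiles(ast) {
  return [
    { tile: "object", label: ast.receiver },
    { tile: "message", label: ast.method },
    ...(ast.arg ? [{ tile: "argument", label: ast.arg }] : []),
  ];
}

// Print the same tree back as its textual representation.
function toText(ast) {
  return `${ast.receiver}.${ast.method}(${ast.arg})`;
}

const ast = parseCall("ellipse.moveBy(10)");
toTiles(ast);  // three tiles: object, message, argument
toText(ast);   // → "ellipse.moveBy(10)"
```

Because both views are derived from the tree, an edit made in either one (retyping the text, or dragging a tile) can be reflected in the other by re-rendering.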
The BrightTable prototype explored how a graphical user interface can be integrated into conventional workflows such as drawing and arranging items on a desk.
By projecting the user interface from the top and having a camera (and in a later version a Kinect sensor) detect objects and gestures, virtual objects can be controlled and real objects (such as drawings on a piece of paper) can be integrated in the virtual world.
The source code for the computer vision part can be found at github.com/rksm/BrightTable.
CodeChisel3D is an experimental development experience for VR. It is based on three.js and can be used by visiting a web page with a WebGL-enabled browser.
See the project page for demos and more information.
Cloxp is a live, Smalltalk-like development environment for Clojure.
In the tradition of Lisp systems, Clojure provides a powerful meta system API that can be used to implement tools like system browsers, inspectors, and live workspaces. See the project page and my Clojure/West talk for more details. A live version of cloxp is running at cloxp.lively-next.org.