My programming journey


This is the story of how I got into programming and the things I’ve learned along the way that have helped me become a better programmer.

As a first-year PhD student I needed to do data analysis and visualization beyond the capabilities of Excel. As someone who did not program at the time, the recommendation from my colleagues was to use OriginLab, an uglier and clunkier version of Excel with slightly more extensive plotting and regression libraries. I had used this software during my undergraduate degree and absolutely hated how frustrating it was: there were many manual steps involved in data analysis, and the spreadsheet-like interface made for poor reproducibility. So I decided to learn a programming language instead. Initially I planned on MATLAB, because some people in my research group used it, but a friend from the math department advised me that if I were going to learn a programming language it should be a real one, and suggested Python.

I’m very grateful to him, because learning Python unlocked a world of fun and learning that would not have been possible with MATLAB. Pretty soon after learning to do basic graphing and data analysis in Python I was automating lab equipment with a Raspberry Pi and running scripts to scrape the web, neither of which I was likely to manage in MATLAB.

Thanks to my newfound programming and hardware-automation skills I was able to hack together a variable-temperature impedance spectroscopy experiment, which let me collect data at much higher resolution than I could have manually. I could set up the experiment in the morning, leave it running for the day, and come back to hundreds of frequency sweeps collected over a range of temperatures.

With all this data it was also important to learn how to automate the analysis. Traditionally, scientists in my field would use off-the-shelf software to analyze impedance data, but it required manual intervention: the user would specify the initial parameters of the fitting algorithm using a graphical point-and-click input. With hundreds of datasets that would have taken me a very long time, so I developed my own mathematical heuristics to initialize the fitting algorithm from the observed data. I could programmatically estimate the initial parameters and then use my own fitting algorithm to fit the data. Once this was working I could fit hundreds of datasets within a few minutes, and visually scan through the resulting fits to check that they were correct.
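The post doesn't spell out the heuristics or the circuit model, so here is a minimal sketch of the idea under an assumed single parallel-RC equivalent circuit, with SciPy's `curve_fit` standing in for the fitting algorithm: the initial parameters are read straight off features of the measured sweep before the optimizer refines them.

```python
import numpy as np
from scipy.optimize import curve_fit

def impedance(freq, r_s, r, c):
    # Series resistance plus one parallel RC element: Z = R_s + R / (1 + j*w*R*C)
    w = 2 * np.pi * freq
    z = r_s + r / (1 + 1j * w * r * c)
    return np.concatenate([z.real, z.imag])  # curve_fit needs a real-valued output

def initial_guess(freq, z):
    # Heuristics read directly from the observed data:
    r_s0 = z.real.min()                  # high-frequency intercept on the real axis
    r0 = z.real.max() - z.real.min()     # diameter of the semicircle
    f_peak = freq[np.argmax(-z.imag)]    # apex of the semicircle, where w*R*C = 1
    c0 = 1 / (2 * np.pi * f_peak * r0)
    return r_s0, r0, c0

# Synthetic sweep standing in for one measured dataset
freq = np.logspace(0, 6, 50)
true = (10.0, 200.0, 1e-6)
flat = impedance(freq, *true)
z = flat[:50] + 1j * flat[50:]

p0 = initial_guess(freq, z)
popt, _ = curve_fit(impedance, freq, np.concatenate([z.real, z.imag]), p0=p0)
```

With a sensible starting point like this, the same loop can be run over hundreds of sweeps unattended, which is the whole point of replacing the point-and-click initialization.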

I loved the feeling of using mathematics and programming to speed up my scientific work. It felt like I had magical powers compared with the other scientists around me who were not building their own tools. After this experience I set my sights on a larger efficiency prize: instead of just optimizing the manual data collection and analysis for the specific material I was studying, I would use my programming skills to help me focus on the right materials in the first place. My whole PhD revolved around fast-ion-conducting ceramics, and it was very hard to choose which ceramic to study, as there are practically infinite combinations of chemicals you can imagine. Coming up with heuristics to choose promising new candidates was an interesting problem, so I set out to learn about algorithms that could help me identify materials worth studying experimentally.

The age-old problem in computational physics and chemistry is that simulating interactions between atoms with the highest-fidelity mathematical models is extremely computationally expensive, and as a PhD student in an experimental lab I didn’t have the resources to test these kinds of algorithms myself. However, there are simplified models which are efficient enough to run on an individual laptop, and I found some promising ones in academic publications and replicated their work. Using these algorithms I scanned through a database of known ceramic materials and identified candidates which I later proved were solid ion conductors. Since this was replicating others’ work, and the material I identified did not display properties competitive with the state of the art, I did not pursue the finding; my PhD funding was coming to an end and I was thinking about the next steps in my career. But by the end of my PhD I had a taste for using programming not only to make my work more efficient, but to guide what work I focused on in the first place.

Part of me wishes I had leaned into the computational work, but as an experimentalist and a self-taught programmer I felt imposter syndrome and did not feel qualified for a spot in a computational academic lab studying battery materials. As fun as it was to learn some lab chemistry, I found the experimental side quite monotonous and wanted to improve my programming skills, so I decided to set out on my own and found a software company.

Toward the end of my PhD I decided it would be really fun to build a software company. I was programming in all my free time, and this was a chance to spend more time doing that while retaining the self-directed work I had come to love during my PhD (I was very fortunate to have a supportive supervisor who encouraged me to decide on my own research direction). I spent a lot of my free time working on ideas for an app, and put together a pitch deck to shop around to investors and potential co-founders. I did trial work with a few potential co-founders, and along the way learned a lot of the fundamentals of shipping a web app and working with Docker. Then I was introduced to Rudy Lai, my co-founder at Tactic, who shared my passion for web scraping. I had been obsessed with learning the fundamentals behind how Google worked, because it was such a life-changing technology for me. When Rudy pitched me the idea of applying web search to enterprise problems I was sold: it would give me the opportunity to learn how it all worked, and I found his vision for the commercial application fairly compelling, so I decided to go all in.

In those early days of Tactic I learned a lot about writing software from Rudy, who had a formal CS degree, but we made a lot of silly mistakes and didn’t know how to ship quickly. Rudy had done a lot of UX and frontend work in the past, so I initially stuck to backend code.

For the first few months we made a common mistake: we stayed heads-down coding without talking to users. As our runway got short we realized we needed to sell what we had built, and as soon as we started doing sales we saw that it was not resonating with our target audience. We had a grand vision for what the product could be, but there were so many technical unknowns, none of which seemed solvable in the time we had left, that we realized it was not the right direction to keep pursuing. We wised up at this point (after studying the YC playbook for the second time, some of the lessons started to sink in) and made a rule: we would not write another line of code until we were sure we could make money from it in less than a week.

This was a turning point for our style of coding. We went from test-driven development with an overly burdensome deployment pipeline (for a team of two) to a hacker’s mindset, maniacally focused on the output of the code without worrying too much about what the code itself looked like. Within a few months of code freeze and sales calls we stumbled upon a set of customers willing to put down a few hundred dollars for us to scrape some web data for them. The job was run with nothing more than a Jupyter notebook. We repeated this kind of sale several times, each time making small adjustments to the notebook. Soon demand was growing and customers wanted more than a spreadsheet via email; they wanted a system where they could schedule more web scraping and interact with the data, so we built a web app on top of the notebook outputs.
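The notebook-as-product workflow boiled down to: fetch pages, pull out a few fields, and hand the customer a spreadsheet. The actual sites and fields aren't in the post, so this is a stdlib-only sketch with made-up data (the company names and URLs are illustrative), parsing anchor tags into CSV rows the way those early notebook jobs might have:

```python
import csv
import io
from html.parser import HTMLParser

class CompanyLinkParser(HTMLParser):
    """Collect (text, href) pairs from anchor tags."""
    def __init__(self):
        super().__init__()
        self.rows = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.rows.append(("".join(self._text).strip(), self._href))
            self._href = None

# Stand-in for a fetched page; in the real job this came from the live web
html = ('<ul><li><a href="https://example.com/acme">Acme Corp</a></li>'
        '<li><a href="https://example.com/globex">Globex</a></li></ul>')

parser = CompanyLinkParser()
parser.feed(html)

# The deliverable: a spreadsheet-ready CSV, one row per scraped record
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["name", "url"])
writer.writerows(parser.rows)
```

A script this small is easy to tweak between sales calls, which is exactly why the notebook held up for several repeat sales before a real web app was warranted.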

We hired a friend of Rudy’s, Alex Sparrow, who was doing contracting work at the time. A fellow physicist and ex-Palantir engineer, he was a really great person to learn from. Working side by side building this production web app for our first 100 users, he helped me improve my engineering skills immensely. When we started making decent money he also helped us hire more engineers, making sure we raised the bar with every hire.

In those very early days the web app ran on a single EC2 instance on AWS. We would SSH into the machine, start a tmux session, pull the latest code from GitHub, and run it directly on the VM. Eventually we containerized everything, ran it via docker-compose, and spun up separate instances for the web app and the data pipeline. Long after the product was productionized we were still manually intervening in almost every long-running job users triggered from the web app. This was a formative period for me as a programmer because there were very few guardrails and I had a lot of control over our system.

As our team and customer base grew, so did our infrastructure, DevOps, and information security requirements. I became an expert in these areas (with the help of my awesome team): mapping our infrastructure in Terraform, setting up isolated staging environments for long-running smoke tests, migrating us to an automated deployment pipeline, and standing up an elastically scalable Kubernetes cluster with groups of CPUs and GPUs, triggered by an event-driven architecture.

Then our startup went through a crisis which opened the door to a new opportunity for me as a programmer. The Covid bubble had popped, companies in our market were firing their sales teams (our target customers), and our growth was flatlining. Up until this point my co-founder Rudy had owned product, design, and frontend; he was in every meeting with the frontend team and the designers. The crisis pushed him to hand those responsibilities over to me so he could focus fully on growth. It was a steep learning curve because my frontend skills were not strong, but fortunately I had some natural ability in design.

I have always loved graphic design, and have spent time in vector graphics programs for as long as I’ve had a computer. When I took over design at Tactic we had just lost our in-house designer, who had helped us lay the foundation of our design system. We brought in a third-party design agency, but I found working with them like pulling teeth: iteration times were too slow for our fast-paced engineering team. Over a Christmas break I practiced Figma and read Refactoring UI by Adam Wathan and Steve Schoger (the creators of Tailwind). By the end of the break I had finished a few projects from the design agency’s backlog in a fraction of the time it would have taken them. The work was very well received by my co-founder and the rest of the team, so we made the call to cut the agency, with me filling the role until we could replace them.

With all this extra responsibility on my plate I spent a lot less time coding, and my programming skills atrophied a bit, but working side by side with the frontend team as their designer gave me a valuable perspective which prepared me for my most ambitious programming adventure yet.

After the wind-down of my first startup I set out to build a Jupyter notebook client application. I initially prototyped it in Swift, only to find that open-source code-editor support has a heavy center of gravity in the JavaScript/TypeScript world, so I switched to Electron/React. I love Jupyter but am extremely frustrated by how clunky and ugly JupyterLab looks; even the Jupyter extension for VS Code makes me angry with how cluttered and unfocused it feels. My time designing at Tactic gave me the confidence that I could get the look I wanted, but getting the feel right was a whole other matter: making a developer tool feel good is about fast rendering, easy configuration, and keyboard-driven workflows.

Initially my React skills were pretty rough, but I managed to launch an MVP on Hacker News that did quite well and brought in a decent number of users. I was pleased that I had built something good enough for people to switch from the battle-tested and well-loved JupyterLab (and VS Code); that felt like a testament to my newfound frontend skills.

Next I pushed for a deep integration with the LLM API providers, and eventually set up my own backend to handle subscriptions for my Pro tier and a proxied LLM API for Pro users. Now I’m setting out to build a cloud-hosted GPU feature that will just work out of the box. It does feel like I’ve created a magical experience here, and I’m really proud of it.

Anyway, this is my story of learning to code. I’m very grateful to my teammates at my first startup, Tactic, who helped me become the programmer I am today, and grateful for the opportunity to work on Satyrn solo, which has given me an immense feeling of ownership and confidence.