<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Semicolons &amp; Side Projects</title><link>https://filbot.com/</link><description>Recent content on Semicolons &amp; Side Projects</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Tue, 10 Mar 2026 16:34:28 -0700</lastBuildDate><atom:link href="https://filbot.com/index.xml" rel="self" type="application/rss+xml"/><item><title>International Space Station Tracker</title><link>https://filbot.com/international-space-station-tracker/</link><pubDate>Tue, 10 Mar 2026 16:34:28 -0700</pubDate><guid>https://filbot.com/international-space-station-tracker/</guid><description>&lt;figure style="margin: 1.5rem 0;"&gt;
 &lt;img
 src="https://filbot.com/international-space-station-tracker/iss-tracker_hu_2014fec6fb66f1f7.webp"
 loading="lazy"
 decoding="async"
 style="max-width: 100%; height: auto;"
 alt="ISS tracker device mounted on wall" /&gt;
 
 &lt;/figure&gt;


&lt;p&gt;I had an old Raspberry Pi 3B that wasn&amp;rsquo;t doing much of anything, so I went looking for a project to give it a purpose. It turns out there are a couple of free APIs that provide the current location of the International Space Station, as well as who is currently in space. The software for this project is written in Python, but you don&amp;rsquo;t need to know how to program to get it running. I&amp;rsquo;ve included instructions in my &lt;a href="https://github.com/filbot/iss-tracker"&gt;GitHub repository&lt;/a&gt;, and the project also includes a theme file for changing certain aspects of the display. The 3D files are available on &lt;a href="https://makerworld.com/en/models/2510676-iss-tracker-housing#profileId-2761043"&gt;Makerworld&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>Hero Dashboard</title><link>https://filbot.com/hero-dashboard/</link><pubDate>Mon, 09 Feb 2026 05:03:55 -0700</pubDate><guid>https://filbot.com/hero-dashboard/</guid><description>&lt;figure style="margin: 1.5rem 0;"&gt;
 &lt;img
 src="https://filbot.com/hero-dashboard/dashboard-angle_hu_1179fd3f5bd7d432.webp"
 loading="lazy"
 decoding="async"
 style="max-width: 100%; height: auto;"
 alt="Hero Dashboard" /&gt;
 
 &lt;/figure&gt;


&lt;p&gt;The Seattle city government provides a public API for fire department emergency dispatch data. There’s a very cool project, &lt;a href="https://sfdlive.com"&gt;SFD Live&lt;/a&gt;, that visualizes this information (and a lot more) with a ton of features.&lt;/p&gt;
&lt;p&gt;What I wanted, though, was something much simpler: a stripped-down view of the data that would be interesting, if not especially useful, on a dashboard I could glance at from across the room. I had an old monitor and a spare Raspberry Pi lying around, so slapping those together and running a web app in kiosk mode on the Pi felt like the perfect excuse for a small side project.&lt;/p&gt;</description></item><item><title>Stark Medical Scanner... sort of</title><link>https://filbot.com/stark-medical-scanner...-sort-of/</link><pubDate>Mon, 10 Nov 2025 05:03:55 -0700</pubDate><guid>https://filbot.com/stark-medical-scanner...-sort-of/</guid><description>&lt;figure style="margin: 1.5rem 0;"&gt;
 &lt;img
 src="https://filbot.com/stark-medical-scanner...-sort-of/hero_hu_3c5cf53c6ea38049.webp"
 loading="lazy"
 decoding="async"
 style="max-width: 100%; height: auto;"
 alt="AI generated image of the Stark Medical Scanner" /&gt;
 
 &lt;/figure&gt;


&lt;p&gt;Let’s get this out of the way — what is “gravy blood”, and who is “Damp”?&lt;/p&gt;
&lt;p&gt;This whole project started as an inside joke. My buddy and I would say we had gravy blood whenever we overindulged in a meal. That sluggish, heavy feeling afterward? In our lore, that’s the gravy blood kicking in.&lt;/p&gt;
&lt;p&gt;Then one night we were watching Iron Man 2 and saw the Stark Medical Scanner. Perfect: we joked that we needed our own version to check our gravy blood levels.&lt;/p&gt;</description></item><item><title>Realtime BART Arrival Display</title><link>https://filbot.com/realtime-bart-arrival-display/</link><pubDate>Sun, 09 Nov 2025 08:03:55 -0700</pubDate><guid>https://filbot.com/realtime-bart-arrival-display/</guid><description>&lt;figure style="margin: 1.5rem 0;"&gt;
 &lt;img
 src="https://filbot.com/realtime-bart-arrival-display/hero_hu_38c51d4c16a5f0b0.webp"
 loading="lazy"
 decoding="async"
 style="max-width: 100%; height: auto;"
 alt="AI generated image of the physical BART display" /&gt;
 
 &lt;/figure&gt;


&lt;p&gt;I have a love-hate relationship with BART. I’m grateful for it, but let’s just say it’s not always the most reliable, so it&amp;rsquo;s nice to see beforehand when the train you need is due to arrive. There are plenty of projects out there that show real-time BART arrival information. This one does that too; it’s nothing groundbreaking, but I wanted to build my own version that captures the vintage BART platform sign vibe I associate with commuting between the East Bay and my job in the city.&lt;/p&gt;</description></item><item><title>RTSP Video Stream on Raspberry Pi</title><link>https://filbot.com/rtsp-video-stream-on-raspberry-pi/</link><pubDate>Mon, 15 Sep 2025 08:03:55 -0700</pubDate><guid>https://filbot.com/rtsp-video-stream-on-raspberry-pi/</guid><description>&lt;figure style="margin: 1.5rem 0;"&gt;
 &lt;img
 src="https://filbot.com/rtsp-video-stream-on-raspberry-pi/hero_hu_f066b094e3dedea.webp"
 loading="lazy"
 decoding="async"
 style="max-width: 100%; height: auto;"
 alt="ChatGPT generated image of the cube" /&gt;
 
 &lt;/figure&gt;


&lt;p&gt;My doorbell camera provides an RTSP stream, which made me think it would be perfect to display on a small dedicated screen. I built this using Jay Doscher’s excellent &lt;a href="https://www.doscher.com/arm-terminal-2-the-cube/"&gt;ARM Terminal 2 - The Cube&lt;/a&gt;. I highly recommend taking a look at his blog, because if you&amp;rsquo;re reading this, you&amp;rsquo;re gonna love his stuff: &lt;a href="https://www.doscher.com"&gt;doscher.com&lt;/a&gt;.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="the-hardware"&gt;The Hardware&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Raspberry Pi 4 running Raspberry Pi OS Lite (Bookworm)&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.amazon.com/dp/B08N5VZN8R?ref=ppx_yo2ov_dt_b_fed_asin_title&amp;amp;th=1"&gt;Geekworm Raspberry Pi 4 Heatsink&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.waveshare.com/wiki/4inch_HDMI_LCD_(C)"&gt;Waveshare 4inch HDMI LCD (C)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.doscher.com/arm-terminal-2-the-cube/"&gt;ARM Terminal 2 - The Cube&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Repurposed metal adjustable arm - &lt;a href="https://www.amazon.com/dp/B08FNQC6TZ?ref_=ppx_hzsearch_conn_dt_b_fed_asin_title_6&amp;amp;th=1"&gt;Example on Amazon&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
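&lt;p&gt;The list above covers the hardware; the post doesn&amp;rsquo;t specify the playback software, so as a minimal sketch, here&amp;rsquo;s one common approach using ffplay from a small Python wrapper (the doorbell URL is a placeholder, not my camera&amp;rsquo;s real address):&lt;/p&gt;

```python
import subprocess

def build_player_command(url, transport="tcp"):
    """Build an ffplay command that shows an RTSP stream fullscreen."""
    return [
        "ffplay",
        "-fs",                         # fullscreen, no window chrome
        "-an",                         # drop audio; the doorbell feed is video-only
        "-rtsp_transport", transport,  # TCP avoids UDP packet-loss artifacts on Wi-Fi
        url,
    ]

# On the Pi itself you would loop this so the display recovers if the
# camera drops the connection, e.g.:
#   while True:
#       subprocess.run(build_player_command("rtsp://doorbell.local/stream"))
```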


 &lt;figure style="margin: 1.5rem 0;"&gt;
 &lt;img
 src="https://filbot.com/rtsp-video-stream-on-raspberry-pi/top-down-labels_hu_55829989bdab5075.webp"
 loading="lazy"
 decoding="async"
 style="max-width: 100%; height: auto;"
 alt="Labeled photo of hardware" /&gt;
 
 &lt;/figure&gt;


&lt;p&gt;My metal arm originally had a lamp attached, so I removed the lamp and saved it for parts. The clamp that came with the arm is not the strongest, but it works fine for the weight of this project. With a little creativity, you can attach it almost anywhere. I mounted mine to a monitor arm so I could position the screen just above my secondary monitor.&lt;/p&gt;</description></item><item><title>Nearby Aircraft Display</title><link>https://filbot.com/nearby-aircraft-display/</link><pubDate>Tue, 09 Sep 2025 19:05:55 -0700</pubDate><guid>https://filbot.com/nearby-aircraft-display/</guid><description>&lt;figure style="margin: 1.5rem 0;"&gt;
 &lt;img
 src="https://filbot.com/nearby-aircraft-display/chatgpt-header-image_hu_a937f92e7d0723bf.webp"
 loading="lazy"
 decoding="async"
 style="max-width: 100%; height: auto;"
 alt="ChatGPT generated image of Flight Display" /&gt;
 
 &lt;/figure&gt;


&lt;p&gt;I have an &lt;a href="https://filbot.com/piaware-data-display/"&gt;ADS-B node&lt;/a&gt; feeding data into the vast network of flight trackers, but it mostly just sits on my roof humming along without any interaction from me. I can, and do, look at the webpage that shows my local flight data and the planes on a map, but it didn’t quite have the whimsy I was after. I decided a wall-mounted device that displayed the aircraft type closest to my location at any given moment would be a fun build.&lt;/p&gt;</description></item><item><title>Understanding JCR in AEM</title><link>https://filbot.com/understanding-jcr-in-aem/</link><pubDate>Wed, 02 Apr 2025 07:47:37 -0700</pubDate><guid>https://filbot.com/understanding-jcr-in-aem/</guid><description>&lt;figure style="margin: 1.5rem 0;"&gt;
 &lt;img
 src="https://filbot.com/understanding-jcr-in-aem/header_hu_72aa6ecc69f662b4.webp"
 loading="lazy"
 decoding="async"
 style="max-width: 100%; height: auto;"
 alt="ChatGPT generated image lego ad" /&gt;
 
 &lt;/figure&gt;


&lt;p&gt;If you’re brand new to Adobe Experience Manager (AEM), one of the first things you’ll hear about is something called the &lt;strong&gt;JCR&lt;/strong&gt;—short for &lt;strong&gt;Java Content Repository&lt;/strong&gt;. That might sound intimidating, especially if you don’t have a Java background (I didn’t either), but don’t worry. This post is here to give you a simple, high-level mental model.&lt;/p&gt;
&lt;p&gt;And the model we’re going to use? Lego.&lt;/p&gt;</description></item><item><title>AEM Developer Cheat Sheet</title><link>https://filbot.com/aem-developer-cheat-sheet/</link><pubDate>Mon, 31 Mar 2025 16:34:28 -0700</pubDate><guid>https://filbot.com/aem-developer-cheat-sheet/</guid><description>&lt;figure style="margin: 1.5rem 0;"&gt;
 &lt;img
 src="https://filbot.com/aem-developer-cheat-sheet/aem-developer-cheat-sheet-header-image_hu_60d4b3ff176fdd27.webp"
 loading="lazy"
 decoding="async"
 style="max-width: 100%; height: auto;"
 alt="ChatGPT generated image of man riding Adobe logo" /&gt;
 
 &lt;/figure&gt;


&lt;p&gt;Just a little something I wish I had when I first started working with AEM. Adobe Experience Manager can feel like a real chicken-and-egg situation—Adobe doesn’t provide an easy way to set up AEM instances for experimentation, so you’re kind of forced to learn on the job. That’s a tough spot to be in when a company is paying you to already know how to do AEM work.&lt;/p&gt;</description></item><item><title>About</title><link>https://filbot.com/about/</link><pubDate>Sat, 29 Mar 2025 11:47:25 -0700</pubDate><guid>https://filbot.com/about/</guid><description>&lt;p&gt;Hey, I’m a frontend developer whose day job is building modern web experiences in Adobe Experience Manager (AEM) environments. Outside of work, I can be found experimenting with CAD, 3D scanning and printing, and microcontroller projects—usually with more whimsy than usefulness. This blog is a space for documenting personal builds, sharing development insights, and exploring the overlap between software, hardware, and design.&lt;/p&gt;</description></item><item><title>PiAware Data Display</title><link>https://filbot.com/piaware-data-display/</link><pubDate>Sat, 01 Mar 2025 19:05:55 -0700</pubDate><guid>https://filbot.com/piaware-data-display/</guid><description>&lt;figure style="margin: 1.5rem 0;"&gt;
 &lt;img
 src="https://filbot.com/piaware-data-display/piaware-lcd-chatgpt-generated_hu_e52aef69b3439081.webp"
 loading="lazy"
 decoding="async"
 style="max-width: 100%; height: auto;"
 alt="ChatGPT generated image of ADS-B Node" /&gt;
 
 &lt;/figure&gt;


&lt;p&gt;After setting up my PiAware receiver to track live flight data, I wanted a simple way to glance at some key stats without SSH-ing into the Pi or opening a browser. Enter the humble 20x4 I2C LCD screen.&lt;/p&gt;
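&lt;p&gt;As a rough sketch of the display logic (the stat names below are illustrative, not PiAware&amp;rsquo;s exact JSON keys), the formatting step might look like this, with the real script polling the receiver&amp;rsquo;s data every second and pushing the lines to the LCD over I2C:&lt;/p&gt;

```python
def format_lcd_lines(stats):
    """Render flight stats as four 20-character lines for a 20x4 LCD."""
    lines = [
        "PiAware Stats".center(20),
        f"Aircraft: {stats['aircraft']}",
        f"Messages: {stats['messages']}",
        f"Max rng:  {stats['max_range_nm']} nm",
    ]
    # Pad/trim every line to exactly 20 columns so stale characters
    # from the previous refresh get overwritten.
    return [line[:20].ljust(20) for line in lines]
```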
&lt;p&gt;This post walks through how I wired up a character LCD to my Raspberry Pi, wrote some Python to pull flight data from PiAware, and got a tiny dashboard updating every second—all without needing to touch a GUI.&lt;/p&gt;</description></item><item><title>MicroMV on OS X in 2018</title><link>https://filbot.com/micromv-on-os-x-in-2018/</link><pubDate>Fri, 05 Oct 2018 11:51:48 -0700</pubDate><guid>https://filbot.com/micromv-on-os-x-in-2018/</guid><description>&lt;p&gt;&lt;img src="micromv-header.png" alt="micromv-header-image"&gt;&lt;/p&gt;
&lt;p&gt;What you need:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Apple Thunderbolt to FireWire Adapter&lt;/li&gt;
&lt;li&gt;MicroMV playback device with FireWire port&lt;/li&gt;
&lt;li&gt;FireWire cable&lt;/li&gt;
&lt;li&gt;MicroMV tapes with footage&lt;/li&gt;
&lt;li&gt;Apple computer running OS X 10.6 or higher&lt;/li&gt;
&lt;li&gt;AVCVideoCap software (&lt;a href="https://nofile.io/f/mNaGHHh9297/AVCVideoCap.app.zip"&gt;download&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Once you have all the hardware needed to connect everything to a computer, turn it all on, insert the tape you want to ingest, and fire up the free AVCVideoCap application linked above. It's a small app written by Apple back in the day to show off some capabilities of the FireWire protocol in Xcode, when FireWire was still a thing. If, when opening the app, you get a message like:&lt;/p&gt;</description></item></channel></rss>