Midnight Cheese Blog

First Trip to Hamvention

June 01, 2024

I've held an amateur radio license since 2012, but this year marked my first trip up to Dayton, Ohio for Hamvention, the largest ham radio convention in the US. Overall it was a fun experience connecting with people I've interacted with online for years and years. It was great to finally meet face-to-face.

While there's plenty of fancy new gear to pine over in the exhibition buildings and lots of old gear to root through in the flea market, the people are really what makes the trip worthwhile. For this trip I challenged myself to conduct a handful of audio interviews for one of the ham radio news websites I run. Getting to connect with the people behind some of the largest (AMSAT) and most technical (M17) organizations in the hobby was fun.

I wish I had experienced Hamvention in the old days at Hara Arena even though everyone I brought it up to said I wasn't missing much, especially in the later years. Still, it would have been fun to experience Hara both in its heyday and toward the end.

I had a chance to participate in a small Mastodon meet-up. Since ditching Twitter in 2022, Mastodon has been my social media home base. There's a really good group of hams that make up that community.

The homogeneous nature of the ham community does worry me, and it was on full display in Xenia. The US is becoming more and more diverse, but we're not seeing that in the ham radio community. Convention attendance is up, while license numbers are down. What does the community look like, both in numbers and in diversity, once the current generation moves on?

Regardless, I'm eager to see what next year brings. Is it worth driving up to Dayton year after year? I would say yes! Though I think one day is plenty for me. Halfway through day two my battery was more than depleted. Beyond Hamvention I'll continue to do my part through the newsletter and news site to promote the up-and-coming technology, communities, and future stars of the hobby. I can't wait to see what's next.

Huge thanks to the Dayton Amateur Radio Association for their encouragement to attend as well as their invitation to the Hamvention Awards Banquet. It was great to meet some of the outstanding people helping to move the hobby forward.

Contributing to the Internet Archive's DLARC

January 06, 2024

The Internet Archive recently added Amateur Radio Weekly (ARW) to their Digital Library of Amateur Radio & Communications (DLARC) collection and I couldn't be more excited! I've been producing ARW since 2014 (minus a two-year pause to focus on grad school), collecting the most interesting stories and projects related to ham radio each week and sending that summary out as an email newsletter. Needless to say, it's an honor to be added to this special collection.

As the web matures we're beginning to see more discussion around preserving content as creators age beyond this life. It's certainly something I've thought about over the last few years, which is why the inclusion into DLARC is so wonderful. There's an opportunity for the content that I've created to live on for other generations to explore. If nothing else, each newsletter is a time capsule of that period in time.

Kay Savetz, the curator behind DLARC, recently published an update in an issue of Zero Retries that provides a great overview of what the Internet Archive is trying to accomplish and how far they've come since starting this project in 2022. The update calls out some incredible finds from some of the biggest contributors to the ham radio hobby. It's humbling to have my newsletter mentioned in the same article.

Newsletters, meeting minutes, and other documents from the KARO-ECHO Ham Radio Association, based in central California, are now in DLARC. We’ve added all 15 issues of Hambrew magazine, "for amateur radio designers and builders," which was published by George DeGrazio WF0K (SK) 1993 through 1997. And, a complete archive of Amateur Radio Weekly, Cale Mooth K4HCK’s wonderful email newsletter that highlights exceptional ham radio content on the web.

I noticed a collection of Apple II software that's recently been included and now I'm ready to pull the old Apple out of the closet and try to receive RTTY on that old hardware!

It's worth mentioning that the entire DLARC project wouldn't be possible without the support of Amateur Radio Digital Communications (ARDC), a leading organization that funds forward-facing projects across the hobby.

Here's to many more years of Amateur Radio Weekly, and many more lifetimes of exploration provided by the Internet Archive's DLARC!

First POTA Activation

December 23, 2023

Parks on the Air (POTA) is one of the best things going for the Ham Radio hobby. It's uncomplicated, fun, and the web app that is central to the entire system is a wonderfully modern technical implementation and user experience.

Today, I activated a park for the first time and I can see why so many Hams make activations a regular event. Dedicating 90 minutes outside while playing radio with a nice view of nature is tough to beat.

As far as first activations go, I tried to set myself up for an easy win. We happened to be staying at Paris Landing State Park (K-2965) for a few days, so I knew I would have plenty of time to attempt an activation. Even if I failed the first attempt, I could try again in a few hours or even the next day. Fortunately, everything worked out on the first try.

Weather was just about perfect. 55 degrees, little to no wind, and broken clouds kept the temperature manageable. I'm terrible when it comes to the cold, so this was getting toward the no-go end of things for me! The park didn't have much activity, so it was easy to find a prime picnic bench by the edge of Kentucky Lake. 10 minutes to unload, set up the antenna and radio, and I was ready to get on the air. I had been practicing with a few backyard "activations" leading up to this weekend, so I felt comfortable with the setup process. (Those activations did not go well, but that's another blog post.)

POTA setup overlooking Kentucky Lake at Paris Landing.

The only real hang-up of the day was getting my laptop's clock synced with a time server. I had used the laptop just 10 minutes before my activation while connected to a reliable network and assumed the clock was synced, but that wasn't the case once I started operating. The laptop's clock was around 2 seconds slow compared to other operators on the band, and that was enough to hinder any hopes of an FT8 QSO. Thankfully, my phone (barely) had a data signal and I was able to use its hotspot feature to get my laptop online and sync the clock. I don't know if the sync service just wasn't running or if the clock drifted that much in the 10 minutes it was disconnected from the internet, but it's something to account for next time.
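
For reference, FT8 decoding generally needs clocks to agree within roughly a second, which is why a 2-second offset was fatal. The check below is a sketch of the standard NTP offset calculation (the same math NTP clients such as ntpd and chrony use); the timestamp values are hypothetical, chosen to mirror a clock running 2 seconds slow.

```python
# Standard NTP clock-offset calculation (RFC 5905). Handy for sanity-checking
# how far a laptop clock has drifted before an FT8 session.
# All timestamps are Unix seconds; the server values below are made up.

def ntp_offset(t1: float, t2: float, t3: float, t4: float) -> float:
    """Estimated clock offset between client and server.

    t1: client transmit time (client clock)
    t2: server receive time  (server clock)
    t3: server transmit time (server clock)
    t4: client receive time  (client clock)
    A positive result means the client clock is behind the server.
    """
    return ((t2 - t1) + (t3 - t4)) / 2.0

# Example: client clock running 2 seconds slow, 0.1 s network delay each way.
t1 = 1_000_000.0   # client sends (client clock, 2 s behind true time)
t2 = 1_000_002.1   # server receives (true time)
t3 = 1_000_002.1   # server replies immediately (true time)
t4 = 1_000_000.2   # client receives (client clock)

offset = ntp_offset(t1, t2, t3, t4)
print(f"clock offset: {offset:+.1f} s")  # prints +2.0; client is 2 s slow
if abs(offset) > 1.0:
    print("clock too far off for reliable FT8; resync before operating")
```

In practice t2 and t3 come from an NTP server's response; on the laptop itself, simply forcing a time resync (with whatever NTP client the OS runs) before leaving a reliable network avoids the problem.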

Buddipole overlooking Kentucky Lake at Paris Landing State Park.

From backyard testing I knew the Buddipole antenna works well on 15 meters, so I started there. It was slow going at first and I wondered if I'd be able to get the 10 QSOs required for full activation credit, but I ended up landing all 10 within 37 minutes. I had budgeted 90 minutes, so with my remaining time I switched over to 20 meters and worked 7 more stations over the next 15 minutes. Beyond the 17 completed QSOs, all at 5 Watts, I'd estimate another 5 attempts that I wasn't able to complete. Interestingly, the final QSO map favored the east coast of the US.

Activation QSOs mapped.

Signal reports during my activation, reaching from North America to Europe.

All in all, this activation went very well. The few bumps were relatively easy to overcome. Had I not done a few practice runs in the backyard over the last couple of weeks, this would not have been successful at all. I learned a lot from those test runs. For future activations, I think it's time to slim down my overall setup. The Gator Box works well, but I have a bunch of extra equipment in there because it usually sits on my desk at home as my main station. The heavy power supply, power distribution module, Raspberry Pi, and associated cabling could all go. Something to tweak in the coming months.

Thanks to all the POTA supporters out there! This was a fun activity and I definitely see myself doing this again.

For those interested, this is my gear list for this particular activation:

  • Radio: Yaesu FT-857D
  • Antenna: Buddistick PRO
  • Antenna Tuner: MFJ-929 (Shouldn't need this, but I haven't yet taken the time to test without it.)
  • Battery: Bioenno LiFePO4 3Ah/36 Watt-hours
  • Case: Gator Case 4U Audio Rack, Shallow

Highway 79 bridge over Kentucky Lake at Paris Landing State Park.

Sunrise at Paris Landing State Park.

The State of Linux on a PowerMac G5 October 2023

September 26, 2023

TL;DR Debian 12 Sid (PPC64) will install on an early PowerMac G5 start to finish. Download, then burn the ISO to CD, hold down the "c" key on startup, run through the installer and all should be well. More info on the MacRumors Forum.

First of all, this entire article is dumb because it's all about trying to run a modern Linux server OS on a 20-year-old machine with a CPU architecture (PowerPC) that has largely left the consumer market. There's really no reason this machine should support anything modern, but thanks to some tireless engineers, a lot of modern software can still run on what is still a fairly capable machine.

I picked up a PowerMac G5 on a whim for the nostalgia factor. Back in the day I owned Apple hardware with G3 and G4 chips as my daily drivers, but I never owned a G5. I jumped straight from a Titanium PowerBook G4 to one of the very first Intel MacBook Pros, though I did use a few G5 machines on the job.

Anyway, besides this machine being old, it's also power hungry, running well over 100 Watts at rest. Two factors against it potentially running as a home server. The Libre Le Potato on my pegboard will outperform this machine in almost every way. Bananas.

So I wanted to try my hand at running my own Pixelfed instance and decided the G5 was the machine. To make this a reality, I would need to run Linux. Well, most Linux distros no longer officially support the older PowerPC architecture, in this case PPC64. PPC64 is different from the more modern series of POWER chips running PPC64LE. In fact, reading through the top hit when searching for PPC Linux distros, most of the top 10 distros listed no longer support PPC64 at all.

However, I eventually found the Fienix project, a Debian-based distro that aims to keep old Apple PowerPC machines afloat with a recent Linux build. This seemed promising even though it appears to be run by one individual with no community around it. As you might guess, this is where things went south.

The best way to install a new OS on New World Apple machines is via optical media. Trying to boot a USB drive via Open Firmware is an absolute nightmare; devices and device paths are incredibly cryptic. I never could boot Fienix from USB media. I tried multiple USB sticks and all the different device aliases, and followed the special video, but no luck. I did happen to have some DVD media in the back of the closet, but it had all degraded to the point where I couldn't burn a working copy. I must have burned a half dozen or more DVDs, but they all produced corrupt discs. The installer would only get so far before hitting read errors, if the discs booted at all. So no USB, and no DVD-based media. Fienix was out.

Mistakenly, I then turned to Gentoo. I had previously stated that I was completely done with Gentoo, but what other option did I have? Gentoo is the only major distro that still supports the PPC64 architecture, and they have CD ISO files. My CD media was still functional, and I was able to boot the CD and run through the Gentoo install process without issue. The problems came with Yaboot, the default bootloader suggested in the Gentoo Handbook install instructions. I went through two lengthy Gentoo installs which ultimately failed when trying to boot with Yaboot. As I mentioned previously, Open Firmware is the absolute worst piece of software to work with. Pairing Open Firmware with Yaboot summons the devil. The Yaboot man page describes Open Firmware as "disgusting." I couldn't agree more.

I ran through a couple Gentoo installs and a million tweaks to Yaboot trying to get the G5 to boot into the new Gentoo install. No luck. Turning to the forums, I learned that the Yaboot instructions in the Gentoo Handbook are incorrect because Gentoo's implementation of Yaboot doesn't currently work with the G5. This has been the case for at least a year. Great.

Rather than simply let it go, I decided I would try to help the next person trying to install Gentoo on a G5 and edit the Gentoo Handbook to note that this would be an impossible task and to save time by avoiding Gentoo like the plague. So I went to try and make an edit to the Handbook. I figured I wouldn't have permission to straight-up edit, but could at least start a discussion so someone with authority could add a note. (It is a Wiki after all!) Creating an account was a non-starter. Then my frustration got the best of me and I decided to post to the Gentoo forum and let people know that the Gentoo Wiki signup process was prohibitive and that a note needed to be added to the Handbook to let people know that installing Gentoo on a G5 using Yaboot is broken. That did not go well. As of this writing, the Handbook has not changed and some poor soul is wasting hours on end trying to get Gentoo installed on a PowerMac G5.

What's that? The discussion tab on the Handbook suggests using Grub instead of Yaboot? Fool me 100 times, shame on me. I ran through two or three more Gentoo installs, this time trying to boot with Grub. No luck. Once again, I am done with Gentoo.

At this point I had visions of going back to whatever latest version of OS X would run on the G5 and then run a virtualized Linux install. Turns out virtualization really wasn't a thing people did back when the G5 was in its prime. There was a lot of emulation of x86 for Windows and Linux, but no apps that supported virtualization. (QEMU was a thing, but the only examples I found were already running on a PPC Linux install, not OS X.)

More Googling, more searching. I kept seeing results from YouTube talking about "Installing MODERN DEBIAN on a Power Mac G5!" and just kept ignoring them because who watches videos that should be a blog post. So I finally watched the video (at 1.5 speed, of course). Dude really had this one weird trick to get Debian Sid running on this old PowerMac G5? Yes, yes he did. Somehow, downloading the "unofficial" Debian 12 PPC64 ISO, burning it to CD, and installing the OS just worked. I can't believe such a thing even exists, but it does, and it's amazing. Modern magic.

So huge thanks to John Paul Adrian Glaubitz, a volunteer Debian developer who still regularly cranks out PowerPC install images and manages the Debian PowerPC mailing list. It's crazy that it all comes down to one person keeping all this old hardware useful in the modern age.

Introducing NetFinder - The Definitive Ham Radio Net Directory

April 18, 2023

NetFinder is an online directory of Ham Radio nets designed to help Hams discover new nets and connect with other Amateur Radio operators.

A comprehensive and easy-to-use directory of Ham Radio nets is a project I've thought about for years, and one I've spent the past 18 months building. This project is intended to fill what I see as a gaping hole in the hobby: a central place to easily find a net and get on the air.

A screenshot of NetFinder displaying a list of nets utilizing EchoLink.

NetFinder accomplishes this goal through the following tools:

  • Simple search by name: If you know the name of a net but need more information, simply search by name and results appear instantly.
  • Browse nets by mode: Only interested in DMR or CW nets? Nets are grouped by a variety of categories for easy browsing.
  • Nets happening now: NetFinder lists nets in progress during the current hour and nets coming up in the next hour.
  • Search by location: (Coming soon!) If you're out of town and looking for a net to join on a local repeater or curious about which nets operate in your area, this tool will provide the answer.
  • Community sourced: NetFinder is driven by the Ham community meaning nets can be added and edited by anyone.
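
The "nets happening now" view described above boils down to a simple time-window filter. Here's a minimal sketch of that idea; the net record shape and example data are hypothetical, not NetFinder's actual data model or API.

```python
from datetime import datetime, timedelta

# Sketch of a "nets happening now" filter: show nets in progress right now
# plus nets starting within the next hour. The dict-based net records here
# are illustrative only.

def nets_happening_now(nets, now: datetime):
    """Return names of nets active now or starting within the next hour."""
    window_end = now + timedelta(hours=1)
    results = []
    for net in nets:
        start = net["start"]
        end = start + timedelta(minutes=net["duration_min"])
        # Keep nets that start before the window closes and haven't ended yet.
        if start <= window_end and end > now:
            results.append(net["name"])
    return results

nets = [
    {"name": "Nine O'Clock Net", "start": datetime(2023, 4, 18, 21, 0), "duration_min": 60},
    {"name": "Noon CW Net",      "start": datetime(2023, 4, 18, 12, 0), "duration_min": 30},
    {"name": "Late DMR Net",     "start": datetime(2023, 4, 18, 22, 30), "duration_min": 45},
]

now = datetime(2023, 4, 18, 21, 45)
# In-progress Nine O'Clock Net plus the upcoming Late DMR Net match.
print(nets_happening_now(nets, now))
```

A real implementation would also need recurrence rules (weekly nets) and time zones, but the core query is just this window comparison.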

Why NetFinder?

Aren't there other ways to find net information online? Yes, but they each fall short.

Several months ago I conducted a survey asking Hams what tools they use to discover new nets, and the answers were all over the place. The survey verified the lack of a central source for good net information. NetFinder attempts to consolidate several use cases onto a single platform.

The ARRL has a net directory, but the user experience is poor and there's not an easy way to find local nets. NetLogger is a great site for finding nets on the air right now, but only surfaces nets using their net management software. Search engines like Google are hit or miss. There are many spreadsheets floating around the internet that attempt to provide a comprehensive list, but those aren't scalable and are a poor experience as well.

A screenshot of NetFinder displaying information for the Nine O'Clock Net out of Seattle, Washington.

Think of NetFinder as the RepeaterBook for nets.

With that said, I invite you to take a look around and see what you think. Feedback is welcome. It's early days and a lot more is still planned for this project. The number of nets in the directory is still small, but it's enough to show the potential of this tool. I hope you'll join me in cataloging the nets of the world and helping other Hams find ways to connect on the air.

—K4HCK

Hacking the University Mac Lab

March 08, 2023

Everything is a hack, from the rendering engine displaying this text to the onboard software that ran on Apollo. It's all expertly crafted, but the fundamental nature of computers forces the potential for bugs and critical errors. No software is foolproof.

This is a short story about hacking the University of Tennessee library computer labs around the year 2002. It's not a sinister, bring the network to its knees type hack. Nothing was broken. This hack involved font files, an FTP client, and a lab of Macintosh computers running OS 9 that were tightly locked down to prevent users from accessing system level settings and files.

At UT I was a graphic design major, which meant I was producing lots of graphic design projects using apps like Adobe Photoshop, Adobe Illustrator, and QuarkXPress. Like most design schools at the time, UT was a Mac shop. The design program had its own small lab of about a dozen Macintosh computers, a large format printer, and an HP Scanjet 4c1. I was also fortunate to have my own Mac, which I purchased with savings from my part-time jobs in high school. It was a 233MHz beige G3 tower. A beast of a machine at the time!

Working on various graphic design projects meant hours in front of the computer getting every detail just right. Cutting out images in Photoshop, creating logos and vector art in Illustrator, and poring over typographic details in Quark were the standard tasks. Unfortunately, lab space was hard to come by. The design lab was almost always full. It also had weird hours, open late into the night one day and closed the next. Weekends were equally unpredictable. I had my own Mac in my dorm room, but living in The Zoo2, it was constant chaos and distraction. Not a place to concentrate on work.

With access to quality workstations lacking, it often meant packing up the Zip Disks we used to haul our work around, and seeking out other labs on campus. However, quality Mac labs were few and far between. Most of the dorms had Mac labs, but the machines were terribly dated and didn't have the Adobe suite of apps installed. The only other viable option was the Mac lab at Hodges Library. This was the main library which meant a huge computer lab with both Mac and PC workstations. This lab was also almost always open except for maybe Sunday mornings. The machines were new, and they had just about every relevant piece of software installed, including the Adobe suite of apps. However, there was one major problem. The Macs in Hodges were locked down with software that limited user access.

Specifically, access to the System Folder which housed the Fonts folder, was unavailable. This was problematic because most design projects were using fonts beyond the system defaults. That meant we had to install any custom fonts onto the machine we were working on and that was done by dragging font files from our Zip Disks into the System Fonts folder. (In addition to the design files on our Zip Disks, there was always a large Fonts folder full of hundreds of fonts that design students would pass along to one another. Adobe's network-enabled font offering was still years away.) Without the ability to access the Fonts folder and install fonts, it would be impossible to work on a project with any typographic component.

As a result, I'd carefully plan out which projects should be worked on at any given lab. If I had a ton of image manipulation to do, I could jump over to Hodges with no problem. If I had a lot of typographic work in a project, I'd have to do that in the design lab or on my computer. Dealing with the locked down Macs in Hodges was a hassle. Most of the other Macs on campus were completely wide open. No restrictions. This was a time before every student had some form of single sign-on account that was required to gain access to any machine on campus and beyond. In the late 90s and early 2000s, anyone could walk into just about any lab, sit down, and have completely open access to any machine and the university network at large. The school had virtually no concept of who was using what in a public lab.

Of course, that meant I could poke around those locked down machines in the Hodges lab with little concern or trace and try to find a way around the protections that were locking down the Fonts folder. I needed the freedom to do my design work where I please, with whichever fonts I please!

I don't remember the details around how I came to the eventual solution, but the hack ended up utilizing FTP server and client software. I spun up an FTP server on the Mac, probably using an app like FTPShare3, then, on the same machine, used Fetch4 to connect to the FTP server. At that point, I had "root" access to the entire file system on the Mac, bypassing the software in place that removed access to the Fonts folder through Finder. I could simply drag and drop the fonts from my Zip Disk into Fetch, and Fetch "uploaded" the fonts to the Fonts folder. Problem solved! I could work on any project I wanted using any font I wanted in any lab I wanted.

While this wasn't a complex hack, it illustrates the limited nature of software and its inability to cover all use cases. The software locking down the lab computers solved the problem for 99.9% of use cases: users couldn't get into system folders through the Finder. It didn't take into account accessing the file system outside of Finder.

1 I don't know why I remember this scanner in such detail, but at the time it was INCREDIBLE to be able to scan images into Photoshop at 600 DPI. It was a real game-changer for design projects, taking visual exploration to new heights. This video is a long, but very satisfying overview of the Scanjet 4c.

2 At the time, Hess Hall at the University of Tennessee was nicknamed The Zoo because it was constant chaos. This was likely a result of having no air conditioning, which required the windows to be open 24/7, and every sound coming from that building somehow amplified itself across campus. It was also the largest dorm on campus, featuring two wings and a courtyard surrounded by the U shape of the building. (I've seen things in that building that I will never unsee.)

3 Mac OS 9 had a built-in FTP server that could be enabled, but the software restricting user access also removed access to network settings.

4 Before Panic's Transmit, Fetch was the FTP program for the Mac. It's what you would expect from a piece of Mac software. Simple, easy, and a joy to use.

This is it! The University of Tennessee Hodges Library Mac computer lab featuring a PowerMac G4 Quicksilver tower running Mac OS 9! Photo taken January 28th, 2002 around 10:30 PM.

Look at the terrible wooden chair in the background. That's what we sat on for hours on end. And we liked it!

Survey Results: Ham Radio Nets

January 02, 2023

A few weeks ago I shared a survey on social media that focused on ham radio nets. This survey was an attempt to understand how people seek out information about nets and how they discover new ones. This post will reveal some of the highlights from the 35 responses that were collected from that survey.

First of all, thanks to everyone that participated! I'm a product manager by day, so this type of qualitative research is one of my favorite aspects of the job. The insights are always enlightening. So without further delay, let's dig in.

Survey Question 1

The first question intended to uncover how people are learning about new nets. The most repeated answers included:

  • Spinning the dial and simply listening to the radio
  • Word of mouth from other hams at social gatherings (club meetings)
  • Using NetLogger
  • The Internet: Newsletters, websites, Google Search, email groups, social media, online club calendar, forums

This question revealed three notable insights.

First, there's not one definitive source for discovering new nets. The methods and sources vary wildly and no single method stood out from the rest.

Second, a surprising insight was the use of NetLogger. I had never used NetLogger, so I downloaded the app, which quickly revealed itself as a legitimate net discovery tool. The "Select Net" button turns out to be a live listing of active nets using NetLogger, and there are plenty to choose from. Sadly, the website hides this very helpful feature. (Aside from the name, the website tells nothing of what the app does.)

Third, a lot of discovery is happening in the Internet space, but again, there's no one source that people gravitate to.

Survey Question 2

The second question focused on the Internet and ultimately sought to reveal websites dedicated to net cataloging and discovery.

Similar insights arose from this question as with the previous question, the most notable being the lack of a dedicated source for hams to discover nets. The ARRL Net Directory was listed twice, but clearly this tool isn't a priority for the ARRL based on the poor experience and limited data set.

Survey Question 3

The third question attempted to uncover why people are searching for nets. The responses varied from boredom to looking for more info after hearing about a net on the air. No notable insights were surfaced from this question.

Survey Question 4

The fourth and final question simply asked what type of nets people are interested in and this garnered the most varied response yet. Name the net type, and someone was looking for it:

  • HF nets
  • Wires-X nets
  • ARES nets
  • Emcom nets
  • Casual nets
  • Nets on repeaters
  • Traffic nets
  • Club nets
  • Rag chew nets
  • Fun nets
  • Educational nets
  • DXing
  • SKYWARN

There are a lot of nets out there!

To summarize, the big takeaway from the survey is the lack of a centralized, definitive source for cataloging and discovering amateur radio nets. There's no RepeaterBook of ham radio nets. No single source to see which DMR or 20m SSB nets are happening Thursday night at 9:00 when I get off work. No way to look up a city and find out what nets are taking place on the local repeaters. It should be easier to find a net and connect with other hams.

Does anyone else see an opportunity lurking?

Apple Watch Owners: How do you rely on this thing?

August 03, 2022

After wearing an Apple Watch nearly 24/7 for the last 6 months I switched back to my Fitbit because I kept missing important text messages.

It took me a few months to realize this, but Apple approaches smartwatch design completely differently than other smartwatch brands. While Fitbit and others treat the smartwatch as a supplement to the smartphone, Apple treats it as a replacement for the smartphone.

This approach manifested itself most critically in the way Apple Watch handles notifications. Prior to Apple Watch, I relied heavily on the phone’s visual notification cue: the screen illuminating when a new text message arrives. For example, I always keep my phone in view when I’m working at my desk, which provides an excellent cue that a notification has arrived, even with the phone silenced.

When notifications are enabled on Apple Watch, iPhone no longer illuminates the screen and no longer plays an audio cue when a text message arrives. The user is forced to rely solely on the watch.

This became problematic in instances when relying on text messaging to know that food was ready from a food truck, or a table was ready at a restaurant. In these noisy environments my iPhone sits on the bar so that I can visually see that message come in. Without the visual cue from the phone I started missing these messages because I couldn’t hear the watch, didn’t feel the watch vibrate, and didn’t see it illuminate. Of course, Apple doesn’t provide any options to get the full notification experience on both devices. It’s one or the other.

Eventually I started missing important messages from my wife simply by sitting at my desk. There seems to be some logic where the watch determines the user is in some sort of focus mode (is it listening for keyboard activity?) and doesn’t give a haptic or audio cue that a message has arrived. Maybe I’m that focused on my work and simply oblivious, but I’d like to think I know myself better than that. The final straw was a missed message from Merredith related to car trouble.

The other downside to Apple Watch taking over notification duty is the lack of variety in notification sounds. When relying on the iPhone, I knew what types of notifications I was getting simply based on sound. Apple Watch only has a single sound for all notifications. No customization in that space either. A step backward from the iPhone experience.

Notifications aside, from a health-tracking perspective I was getting way less movement with Apple Watch. Fitbit prompts the user to achieve 250 steps an hour. That got me away from my desk on a regular basis, which resulted in more steps and much-needed mental breaks. Apple Watch prompts the user to stand, but doesn't require a step count. I could stand at my desk for 1-2 minutes and meet the goal. Apple Watch also doesn't highlight step count in any way; they leave that up to 3rd party apps. Light fitness tracking and movement is definitely secondary for Apple.

Fitbit isn't all roses. While I can go four days without a charge (vs. charging every other day with Apple Watch), the app ecosystem is non-existent. In the six months I was away, Fitbit appeared to release only one new watch face. They rely much more heavily on the development community to provide watch faces, and their standards are low. Most faces are visual monstrosities that require additional purchases.

These are downsides I'm willing to live with if it means no longer missing important messages.

Yoshi Sodeoka Rides the Psychedelic Wave

May 29, 2022

This is a research and writing sample from Art History 701 at SCAD. The assignment was to write a short research paper about a contemporary artist of my choosing.

Yoshi Sodeoka Rides the Psychedelic Wave

Contemporary artist Yoshi Sodeoka was an early pioneer in combining art and technology in the emerging internet age and continues to influence digital art and popular culture 30 years later. Art and technology have had a symbiotic relationship since the first pigments were applied to the Lascaux Caves in France, and Sodeoka is only one artist in an extensive line of artists and movements that consider the melding of art and technology as a means toward a better existence. However, his consistent body of tech-infused work stands apart, shaping political discourse, popular music, and the technology used to buy and sell art itself.

Yoshi Sodeoka was born in Japan and moved to the United States in 1989 to study art and design at Pratt Institute in New York City.1 In the three decades since, Sodeoka has combined art and cutting-edge technology to develop a style often described as psychedelic art. His success includes exhibitions at the Museum of Modern Art, The Whitney Museum, Tate Britain, and many others.2

Sodeoka was one of the earliest artists to integrate internet-era technology into his art. In one of his first projects, he served as Art Director for Word Magazine beginning in 1996, a webzine considered one of the first of its kind. Word Magazine included some of the first examples of pixel art on the internet. At the time, publishing daily art content on the internet and building a community around it was groundbreaking. In those early years, Sodeoka began experimenting with emerging technologies like DVD, Enhanced CD, and Macromedia Director. Eventually he turned to video, mashing up analog VHS signals with the personal computer, often moving images and signals back and forth between mediums to create a unique style.

Sodeoka’s desire to combine technology and art was not unlike the work of Experiments in Art and Technology (EAT), a collective of artists in the 1970s that combined multiple mediums with new forms of technology to create unique experiences. This comparison links Sodeoka with the likes of Robert Rauschenberg and Robert Whitman, two titans of the American art world. While the technology (wireless audio, sonar, and space age mirrors to name a few) EAT adopted was earlier and different from what Sodeoka would eventually adopt, the goals were similar: to enact change in the world using art and technology.

In the coming decades, Sodeoka would do exactly that by forming a multitude of experimental art collectives including Undervolt & Co and C404. Through these groups, Sodeoka would drive political discussion, shape what popular music looks like, and invent entirely unique motion graphics processes in After Effects along the way.

In 2004 Yoshi Sodeoka produced one of his pinnacle political art pieces, a digitally manipulated video of George H. W. Bush’s 1991 State of the Union Address titled ASCII BUSH (Figure 1). The video is rendered in American Standard Code for Information Interchange (ASCII) in which the subject matter appears as a constantly changing digital array of characters and symbols. The audio is digitized to make the voices sound robotic, and the cheering crowd sound like white noise. The result produces a very grating, flat, and programmatic feel to the topic of war and politics. The stateliness, patriotism, and presidential feel are erased, replaced with a cold, cruel, and unwavering drone of unfamiliar visuals and audio. The viewer is forced to experience the political message through a completely different lens.

More recently, Sodeoka was commissioned by the psychedelic music project Tame Impala to produce a music video for a track titled Elephant (Figure 2). With a background in music, Sodeoka tries to merge visual and sound together as a single entity, often visualizing color and shape in his mind before ever producing a single visual element.3 The process produces his signature psychedelic motion graphics style, a blend of analog and digital signals interacting, shifting in shape and color, and creating a sometimes-dizzying barrage of visual intensity. Elephant is no exception. The music video rolls, flashes, bends, duplicates, and explodes with color. The shifting visuals often keep pace and react with the underlying beat of the music to create a synthesis of audio and visual perfection. Sodeoka’s intense visual style pairs well with the repetitious, electronic, and distorted sounds of Tame Impala.

To achieve his unique visual style, Sodeoka has made After Effects his tool of choice. While After Effects is a common tool, his results are often in a class of their own. Going beyond simply running video through a series of effects, Sodeoka has developed an entirely unique process built on After Effects' expression scripting engine. Through programming, visuals can take on computational and randomized forms, producing unlimited unique outcomes that are often difficult to place in a specific category of art.

As technology has advanced throughout his three-decade career, Sodeoka has continued to embrace the latest tech-driven tools and concepts in his work. This is true today with Sodeoka’s adoption of cryptocurrency, specifically non-fungible tokens (NFTs), in many of his latest pieces. Whether cryptocurrency and NFTs are simply a marketing fad or here to stay remains to be seen, but Sodeoka has effectively harnessed their purpose in the current moment. His NFT art closely mimics his other pieces by using the same After Effects method to produce wild, abstract, and constantly shifting blobs of pixels that mesmerize the viewer in a manner appropriate for the current era of continuous consumption of media.

In addition to selling several of his own pieces as NFTs on platforms like Foundation, Sodeoka has been commissioned by major brands such as Forbes to incorporate NFTs into fundraising campaigns. In the piece titled Merchants of the Metaverse (Figure 3), Sodeoka collaborated with photographer Michael Prince for Forbes to create a digital version of a special edition magazine cover featuring the Winklevoss Twins of Silicon Valley fame.4 In this piece, Sodeoka repurposes a traditionally designed magazine cover and applies his trademark visual style of hybrid analog and digital glitches to give the cover an animated presence that is impossible to achieve in print. This is a notable example of multiple forms of technology coming together to create new experiences: older print technology merges with modern graphic design, which in turn merges with innovative After Effects and NFT processes to create completely new forms of art. Typography slices and splits, images break into blocky sets of pixels, and the entire cover begins to transform, revealing its digital essence. The visuals are fitting, referencing the subject matter of the cover: the Metaverse, Silicon Valley, and the investors driving it. In the end, the NFT auction raised $333,333 benefiting the Committee to Protect Journalists and the International Women's Media Foundation.5

Yoshi Sodeoka's body of work spans an impressive 30 years of incorporating the latest technology into his art. In a time when the world changes at a blindingly fast rate, Sodeoka's art manages to keep pace. His ability to embrace change is notable not only in the space of technology, but in the way that embrace allows him to remain part of popular culture and shape the way culture is defined visually. Whether through music, politics, or publishing, Sodeoka continues to challenge his viewers. With an impressive variety of work already behind him, what will the next 30 years of his technology-infused art look like?

Yoshi Sodeoka, ASCII Bush, 2004

Fig 1. Yoshi Sodeoka, ASCII Bush, 2004, Digitally Filtered Audio and Video, https://sodeoka.com/ASCII-BUSH

Yoshi Sodeoka, Elephant, 2012

Fig 2. Yoshi Sodeoka, Elephant, 2012, After Effects Rendered Music Video, https://sodeoka.com/Elephant-Tame-Impala

Yoshi Sodeoka, Merchants of the Metaverse, 2021

Fig 3. Yoshi Sodeoka, Merchants of the Metaverse, 2021, Mixed Digital Media, https://cryptodaily.co.uk/2021/04/Forbes-Mints-Cover-As-Its-First-NFT-Merchants-of-the-Metaverse

Endnotes

1 Naoko Fukushi, “Yoshi Sodeoka,” Shift Japan, October 4, 2003, http://www.shift.jp.org/en/archives/2003/10/yoshi_sodeoka.html

2 Jyni Ong, “Digital artist Yoshi Sodeoka’s Work Cannot Be Categorised,” It’s Nice That, September 9, 2015, https://www.itsnicethat.com/articles/yoshi-sodeoka-the-new-york-times-digital-050919

3 Kate Neave, “In Digital: Behind the Screen with Yoshi Sodeoka,” OpenLab, accessed May 19, 2022, https://openlab.fm/news/in-digital-behind-the-screen-with-yoshi-sodeoka

4 Forbes Press Release, “In A First For The Publishing Industry, Forbes Transforms Latest Cover, Featuring Cameron And Tyler Winklevoss Into NFT Contemporary Art,” April 5, 2021, https://www.forbes.com/sites/forbespr/2021/04/05/in-a-first-for-the-publishing-industry-forbes-transforms-latest-cover-featuring-cameron-and-tyler-winklevoss-into-nft-contemporary-art/?sh=53d0d7825dcf

5 Vince Dioquino, “Forbes Mints Cover As Its First NFT: ‘Merchants of the Metaverse’,” CryptoDaily, April 9, 2021, https://cryptodaily.co.uk/2021/04/Forbes-Mints-Cover-As-Its-First-NFT-Merchants-of-the-Metaverse

Bibliography

Dioquino, Vince. “Forbes Mints Cover As Its First NFT: ‘Merchants of the Metaverse’.” CryptoDaily, April 9, 2021. https://cryptodaily.co.uk/2021/04/Forbes-Mints-Cover-As-Its-First-NFT-Merchants-of-the-Metaverse.

Forbes Press Release. “In A First For The Publishing Industry, Forbes Transforms Latest Cover, Featuring Cameron And Tyler Winklevoss Into NFT Contemporary Art.” April 5, 2021. https://www.forbes.com/sites/forbespr/2021/04/05/in-a-first-for-the-publishing-industry-forbes-transforms-latest-cover-featuring-cameron-and-tyler-winklevoss-into-nft-contemporary-art/?sh=53d0d7825dcf.

Fukushi, Naoko. “Yoshi Sodeoka.” Shift Japan. October 4, 2003. http://www.shift.jp.org/en/archives/2003/10/yoshi_sodeoka.html.

Khaikin, Lital. “Yoshi Sodeoka Video Artist Interview: Psychedelic Apocalypse in the Digital Realm.” Redefine Magazine. November 6, 2019. https://redefinemag.net/2014/yoshi-sodeoka-video-artist-interview-psychedelic-digital/.

Neave, Kate. “In Digital: Behind the Screen with Yoshi Sodeoka.” OpenLab. Accessed May 19, 2022. https://openlab.fm/news/in-digital-behind-the-screen-with-yoshi-sodeoka.

Ong, Jyni. “Digital artist Yoshi Sodeoka’s work cannot be categorised.” It’s Nice That. September 9, 2015. https://www.itsnicethat.com/articles/yoshi-sodeoka-the-new-york-times-digital-050919.

Schumacher, Sebastian. “All You Can E.A.T. The 1970 Pepsi Pavilion in Osaka.” Uncube. July 14, 2019. https://www.uncubemagazine.com/blog/13753251.

Smith, Greg J. “Interview with Yoshi Sodeoka: Infinite Cycles.” Sedition. June 2, 2019. https://www.seditionart.com/magazine/interview-with-yoshi-sodeoka-infinite-cycles.

Wasting an Evening with distcc

April 25, 2022

I need to stop going down these Gentoo rabbit holes. Recently I picked up an old PowerBook G3, restored it to working order, and, against my better judgment, immediately installed and compiled Gentoo. Got it working. All was well.

Unfortunately, because Gentoo builds all updates from source, it can take 3-4 days just to compile and update a bare-bones install running i3 and a web browser. I've known about distcc for a long time. distcc lets other machines on the network handle compilation. This sounded like the perfect solution! I could offload compile time from the old 400 MHz G3 and blaze through updates in just a few minutes using a modern machine. I finally decided to give it a shot.

I got everything installed and configured on the PowerBook; on the machine that would do the heavy lifting (Gentoo in a VM on an AMD Ryzen 7); and, for good measure and testing purposes, on an old Dell Mini 9 also running Gentoo. This portion wasn't without its hiccups.
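For reference, the client-side Gentoo configuration amounts to just a few lines. A rough sketch (the IP addresses and job counts here are illustrative, not my actual setup):

```shell
# /etc/portage/make.conf on the machine requesting builds (the PowerBook)
FEATURES="distcc"
# Rough rule of thumb: -jN where N covers the remote cores plus local
MAKEOPTS="-j9 -l2"

# /etc/distcc/hosts: helper machines in order of preference, with job limits
# (Ryzen VM first, Dell Mini 9 second, fall back to localhost)
192.168.1.10/8 192.168.1.11/2 localhost/1
```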

Tests went well when running updates on the Dell. The Ryzen machine picked up the work from the old laptop and powered through. Not so when updating the PowerBook. It turns out cross-compiling for a different architecture requires more work involving crossdev. I got crossdev installed, but ran into errors when building out the PowerPC toolchain: something about gcc being "not supported by your assembler." I tried different versions of gcc, but to no avail. At that point, things were beyond my knowledge base. (A weird issue with keyboard input hanging in the VirtualBox VM didn't help.)
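For the record, building the PowerPC toolchain on the helper follows roughly this recipe. Treat it as a sketch rather than gospel; the target tuple must match what gcc on the PowerBook actually reports:

```shell
# On the x86_64 helper machine: install crossdev and build a
# PowerPC cross toolchain that distcc can hand jobs to.
emerge sys-devel/crossdev
crossdev --target powerpc-unknown-linux-gnu

# distcc finds the cross compilers by their full tuple name
# (powerpc-unknown-linux-gnu-gcc), so the client has to invoke
# that tuple rather than plain gcc.
```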

Cross-compiling with distcc didn't work this time around. But it got me thinking about Gentoo in general. Gentoo used to be my favorite distro. It's a very educational distro in that it forces you to learn through its lower-level setup process. And it's one of the few distros that still supports older architectures such as 32-bit x86 and PowerPC; just about every other distro has moved on to amd64 and ARM exclusively. But the number of roadblocks when working with Gentoo day-to-day feels much higher than it used to be. emerge updates routinely result in conflicting packages and circular dependencies. emerge --sync has been plagued with failed manifest validations requiring continuous mirror changes. Documentation is out of date.

I know it's pointless to try to keep this old hardware running, much less ask it to run the latest OS technology. But solving those puzzles is fun. Unfortunately, the puzzles seem to be getting more and more complex.

My First Experience with the Internet

December 08, 2021

I first stumbled onto the Internet in August of 1994. Until that point, I had been online in one form or another, whether playing games on The Sierra Network or dialing into local BBSs. In 1994 there really wasn't a direct connection to the Internet available at home. You had to tie up your phone line dialing into a separate network or BBS which then happened to be connected to the Internet. The Internet was simply a sub-feature of closed online services.

In my case, I was at home on the family Packard Bell PC, packing either a 386 or maybe by that time a 486 CPU, with a modem running at blistering speeds somewhere between 1200 and 9600 bps. We either subscribed to or regularly picked up copies of Computer Shopper, a massive magazine at the time filled with all sorts of computer-related articles, ads, and directories. Specifically, Computer Shopper contained pages and pages of BBS listings. BBSs were listed by area code, so you'd scan through looking for your city's area code to locate your local BBSs. Calling a number outside your own area code cost extra, charged per minute, so you couldn't dial just any random BBS. It had to be in your city.

We lived in Miami at the time, which meant I didn't have to scan too deep into the listings to find the 305 area code. On this particular day in August I found a different type of BBS, hosted by the Miami Dade County Public Library System. They called it the Miami FreeNet. I wish I could remember more about FreeNet; I've never been able to find any information about it online. I assume the name indicated it was an open BBS (some were private and charged for access). Whatever it was, it was local, free, and part of the library system, and that was enough for me to dial in and start exploring.

I don't remember how I stumbled onto the Internet from there. Like every other BBS, this one was text based: no graphical interface, no images on the screen, just text and more text, the equivalent of a terminal window. I tabbed over to one menu item or another and perhaps ended up in a Lynx instance? Whatever the browser, I know I ended up on yahoo.com. Difficult to forget a service named Yahoo! I know this took place in August of 1994 because from Yahoo! I ended up on the Woodstock 94 website. The Woodstock website was such an interesting experience because they were uploading digital images during the festival. That meant I could be at home experiencing the festival by downloading those same images in near real time. Of course, the images weren't displayed inline. Each image had to be downloaded individually, waiting half a minute or more for each to transfer at 9600 bps. Once downloaded, the images could be opened in Microsoft Paint and taken in, in all their glory. It really was a miraculous process, images traveling hundreds of miles across telephone lines. Apparently, Woodstock 94 was one of the first events to offer that type of online experience. Sadly, only partial descriptions of the "net-connected" portion of the event seem to remain online.

And that was it. That simple experience of "instantly" viewing images of an event happening across the country was enough to want to explore more. This experience was unlike anything else and I knew it was the future.

A year later I'd be dialing into the Internet more directly, this time living in Nashville, connected to the Nashville Computer Solutions Network (NCSN) and surfing away with Netscape and a fully graphical web experience. A year after that I'd be building my first website for money at 16 years old.

Twitter on a 2nd Generation Kindle

May 25, 2021

I recently created a Twitter appliance using an old second generation Kindle.

The allure of vintage computing has been strong, so I figured: what better device to play with than a second generation Kindle? The best feature of this old e-reader is the built-in "free forever" 3G internet access that came with the device. The purpose of the free connection was to allow e-book purchases and downloads from any location, a relatively low-bandwidth and presumably low-cost feature that Amazon could easily afford from all the ensuing e-book purchases.

Second Generation Amazon Kindle with Twitter

Amazon also included an "experimental" web browser. At the time, just about every website was accessible in all the black-and-white glory an e-ink screen could provide. Catching up on news and social media was possible. Today, not so much. When this Kindle was first released, https wasn't a thing outside of checkout flows. Now, with the ubiquity of https, the Kindle's experimental browser can display almost no modern websites, because it doesn't support https at all.

Enter the lo-fi web. Sites like 68k.news and the FrogFind! search engine breathe new life into old browsers. With those sites as inspiration, I whipped up a bare-bones version of my Twitter feed and pointed the Kindle's web browser right at it. Success! It's nothing more than the Twitter API surfacing the latest Tweets and their associated user avatars, with no interaction other than paging through the results and hitting refresh for more.
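I'm not publishing the real code here, but the idea is simple enough to sketch. Something along these lines (the tweet fetching happens elsewhere via the API; the field names and markup below are illustrative, not my actual implementation): render each tweet as the plainest possible HTML, and downgrade any https asset URLs so the old browser can actually fetch them.

```python
def render_page(tweets):
    """Render a list of {'user', 'text', 'avatar'} dicts as minimal HTML:
    no CSS, no JavaScript, and no https URLs the Kindle can't load."""
    rows = []
    for t in tweets:
        # The Kindle's browser can't speak https, so downgrade avatar URLs.
        avatar = t["avatar"].replace("https://", "http://")
        rows.append(
            '<p><img src="%s" width="24" height="24"> '
            "<b>%s</b>: %s</p>" % (avatar, t["user"], t["text"])
        )
    return "<html><body>%s</body></html>" % "\n".join(rows)


page = render_page(
    [{"user": "kr4dio", "text": "CQ CQ", "avatar": "https://example.com/a.png"}]
)
```

The key constraint is simply avoiding https, stylesheets, and scripts entirely; everything else is ordinary server-side templating.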

Initially I tried to include image media but doing so quickly exhausted the limited amount of available RAM and crashed the browser.

The result is the modern web experienced through a slow, low resolution computing lens. It may not be much, but it's fun to revitalize this old hardware. And it's tough to beat a free connection to the world wide web.

Moving from Hexo to 11ty

March 29, 2021

This blog is now generated using Eleventy! I recently moved everything over from Hexo for a multitude of reasons. Primarily, the template I was using was bloated with unnecessary logic, markup, and various JS libraries. I didn't like the idea of using a template when I could instead build and design something from scratch; it didn't feel right not having a custom approach. In addition, Hexo had a number of outstanding caching bugs that made development nearly impossible.

This isn't a how-to on moving away from Hexo, but a number of sources helped me along the way. The CSS layout is based very closely on Josh Comeau's Full-Bleed Layout Using CSS Grid. This is the first time I've incorporated CSS Grid on this site, and I was surprised by how simple the layout process has become. This example took minutes to implement as opposed to hours using other methods. Eleventy setup was sourced heavily from Alex Pearce, especially getting the date-based URL structure right. In addition, I had a handful of excellent examples from Trey, Josh, and Alex. Trey has a great 11ty starter project with Sass and Netlify support.
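The heart of Comeau's full-bleed pattern is tiny. Roughly (class names follow his article; the 65ch cap is a suggested default, not a requirement):

```css
.wrapper {
  display: grid;
  /* center column caps at a comfortable line length */
  grid-template-columns: 1fr min(65ch, 100%) 1fr;
}

/* every child lands in the middle column by default */
.wrapper > * {
  grid-column: 2;
}

/* opt-in class stretches an element across all three columns */
.full-bleed {
  width: 100%;
  grid-column: 1 / 4;
}
```

Children sit in the constrained middle column unless tagged .full-bleed, which spans the full viewport width. That's the part that took minutes instead of hours.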

I've enjoyed a more manual approach with this project. The CSS is so minimal I decided against a preprocessor, though I love Sass for large projects. Same with markup: I'm writing HTML elements manually as opposed to using Markdown. Markdown has its place, I just really enjoy writing HTML "by hand."

11ty is the 6th tool I've used to publish this site since 2001. It started with Blogger, then Greymatter, moved to WordPress, then Octopress, Hexo, and now Eleventy. This tool feels the most manageable and flexible so far and I hope to stick with it for a long time. Or at least until the next shiny new tool starts trending. This project is viewable on GitHub.

Open Firmware DEFAULT CATCH! code=300

October 06, 2020

There are a number of forum posts that focus on a firmware error thrown by older Macs, from PowerBook G3 laptops to some of the early iMacs and G3 towers. All of the posts suggest zapping PRAM and various other software fixes that don't seem to solve the problem for the majority of Macs running into it.

I recently acquired an old Apple PowerBook G3. It booted up just fine the first time, but every subsequent boot resulted in the DEFAULT CATCH! code=300 Open Firmware error. I tried booting from a number of bootable CD-Rs (Ubuntu, OS X, OS 9), but that also resulted in the dreaded code=300. The OS X CD wouldn't boot at all. But I noticed the OS 9 CD would get only so far before throwing one of those full-screen, freeze-the-entire-OS bomb errors, a common problem due to OS 9's lack of protected memory. Then I noticed the Ubuntu CD would freeze at the RAM disk stage of boot every single time. These were two strong hints that the RAM chips might be bad.

Under the PowerBook's keyboard was easy access to the two RAM sticks. With one removed, the system booted completely every time. Swapping in the other stick brought the code=300 error right back.

If you find yourself working with an old Mac and run into an Open Firmware DEFAULT CATCH! error, there's a good chance your RAM is bad.

What's the goal? Amateur Radio license changes

August 02, 2020

The ARRL appears to be embarking on a campaign to provide more spectrum access to Technician Class Hams. (It's impossible to find any mention of this on their website, but that's a different blog post.) This is the most significant and potentially game changing task the ARRL can take on as the biggest lobbying force for the hobby.

What's the goal?

Before getting into whether or not this draft plan is good or bad for the hobby, we need to ask what the goal of all this is. I assume the ARRL has communicated their goal somewhere, but again, impossible to find on their website. Do we want more people to join the hobby (growth)? Do we want more existing Hams to participate within the current framework (retention)? Do we want to maintain the status quo?

What do we want the future of the hobby to look like? Do we want to see real increases in participation and diversity, beyond the stagnant 1% annual growth rate? Or do we want to continue on the current path of a small, homogeneous, yet tight-knit community that we see today?

The current barrier to entry

The current license class system is an arbitrary barrier to entry. The chasm of privileges between the Technician and General Class licenses is the biggest roadblock to both entry and regular participation in the hobby, and this comes from personal experience. Halfway through my Technician class in February of 2012, I realized the Tech license wasn't going to be sufficient to let me play around with digital modes. My entire goal in taking the Tech class was to play with JT65 on 20 meters. The Tech license wasn't going to get me there, and a General license class wouldn't be offered for another year. To solve this, I promptly started memorizing General pool test questions on QRZ so I could pass both the Tech and General tests on the same day.

There are a lot of problems to unpack from that experience. License classes and test sessions, in general, are slow to come by, a huge problem in today's world of instant gratification, and for attracting younger members (I was 32 at the time). I missed out on a proper education in General operating technique and precautions because I simply memorized the test. On the surface, you could argue it's dangerous not to have the proper education, but I think what it really says is that there's no good reason to wall off General privileges behind another license class and test. (I have neither blown up my rig nor been contacted by the FCC for non-compliance with the rules.) Either way, the upgrade system is broken. I don't believe most people would do what I did. In that scenario, I believe most people would give up on the hobby if they learned halfway through the Tech class that they would only gain access to a sliver of what's possible with a General license.

Big changes equal big results

The only way to both cultivate and grow the Ham community is to widen the entry path. We need the interest and participation of more people or this hobby will literally die off in another 20 years. There's no downside to giving Techs more access to HF. The Tech license itself provides enough checks and balances to keep the airwaves from becoming a Citizens Band situation.

If we want to keep the status quo of a walled-off community with a slow drip of new participants, let's keep doing what we're doing. If we want younger voices with new thought leadership and new technology ideas, and simply more people to talk to on-air, let's give Techs more room to participate.

I know a lot of Hams that picked up their Tech license and promptly put it in a drawer never to be picked up again. I bet a few more privileges would encourage them to take a second look.

For more content like this, subscribe to Amateur Radio Weekly, a weekly summary of the most relevant content in the world of Ham Radio. ♦

First Things First Manifesto

May 12, 2020

The First Things First design manifesto was a defining document in the design world of the 1960s, so much so that it has been refreshed over the years (2000, 2014) to address the challenges of the current age. With Earth Day marking its 50th year, yet another iteration (F1rst Things F1rst) was written this year.

This latest version lined up so well with a version I wrote last year as part of a SCAD class project that I thought I would post mine here in solidarity. A poster also accompanied the assignment and is included at the end.

A Design Manifesto

Designers as individuals are responsible for the unfortunate state of our society.

While previous generations of designers were concerned with industry's tangible outputs and the belief that those outputs contribute to an empty culture, the core responsibility lies with the individuals driving that industry. Individuals embrace questionable behaviors. Individuals drive the design profession toward unethical practices on a daily basis. Through individual designers' actions, the profession as a whole is robbed of focus, wallows in false narratives, and drives excessive consumption. In turn, our society suffers.

I have a lack of focus
I'm too obsessed with the latest thing, the latest process, the latest tool to drive the process. So much precious time is spent thinking around the problem, I never solve it.

I line up to carry dirty water
Big companies do bad things, but I'm always there to rebrand and distract from the core issues.

I perpetuate mindless consumption
Design drives us to buy useless goods in the physical space and collect empty likes in the digital space.

I must do more than point out the problems. I must work toward self-help, pushing back on these questionable norms.

Individual responsibility forms the whole. As one of us changes our actions, others take note. A steadfast persistence guides more of us toward a better path. Eventually, collectively, we create positive change through thoughtful and ethical design choices.

I promise to give myself proper time, space, and research to work through problems and offer solutions. I promise to release myself from cleaning up corporate mistakes, pushing back on companies to own their missteps. I promise to provide users with the experience they deserve, freeing them from the maze of consumption.

Join me in keeping these promises, creating a healthy community of designers, and ultimately, building a healthy world to live in.

Cale Mooth

Manifesto poster

Apple Dashboard and how Widgets nearly ruled the world

December 14, 2019

The OS X Dashboard death knell has been tolling for some time, but Dashboard was once a premier space for front-end devs to show off.

This post is a look back at Dashboard's heyday, but also at how close Widgets came to being the obvious springboard to iPhone apps before the release of the App Store.

Steve Jobs and Apple Dashboard at WWDC 2005

Steve Jobs demoing Dashboard at WWDC 2005.

Some history

Dashboard was unique because its little applications (Widgets) weren't heavy Cocoa apps written in Objective-C, but instead HTML, CSS, and JavaScript all running in WebKit.

This was a big deal because the barrier to application development became incredibly low. Any front-end developer could build an application that ran on the OS X desktop. It sounds trivial, but the divide between "application developers" and website builders was vast back in the early 2000s. (At least it felt that way for many of us.)

Apple was invested in Widgets. How invested? An entire section of the Apple website was dedicated to Widgets where users could download the little apps. This was never the case for traditional apps. Can you imagine an Apple directory and repository for desktop applications in that time period?

Apple also had a Widget category for the Apple Design Awards in 2006 and 2007.

Apple employees on the Apple Dashboard listserv encouraged Widget developers to enter the awards and so we did! I entered Candor Gallery, a Widget that rotated artwork submitted by a community of artists from around the world.

Candor Gallery Widget

The Candor Gallery Widget.

I had so much fun developing the Widget and the system that allowed artists to send in artwork for consideration for display. I learned basic PHP and MySQL to support and build out this idea. The images and artist info would need to be stored and retrieved and submissions would need to be processed.

Eventually, Candor Gallery was displaying 1,700 unique pieces of art across 4,000 widget views per day. Apple published a count of 40,000 downloads at one point in time. Big numbers back then.

iCreate magazine asked if they could include Candor Gallery on a CD-ROM of software that was sent with every issue. CD-ROMs!

iCreate Magazine CD-ROM spread

Apple singled out Candor as the "Featured Download" on apple.com on multiple occasions. That's incredibly validating.

Candor Gallery featured on apple.com

I even went to WWDC 2006 and was able to witness the Apple Design Awards ceremony. Candor didn't win, but it was exciting nonetheless.

Front-end developers' chance to rule the iPhone

WWDC 2006 was a bit of a dud in terms of announcements (Intel Macs had arrived in 2005 and the iPhone wouldn't come until 2007), but it was a fun experience. How many people can say they saw a Steve Jobs keynote in person?

Front-end development dominated several conference tracks and seeing the Webkit team talk about upcoming HTML and CSS support was exciting. Apple was heavily invested in the web and the technologies driving it.

This continued into 2007 after the announcement of the iPhone. Remember, during that first reveal, Steve Jobs was adamant about no native third-party apps running on the iPhone. Third-party apps would all be web based, driven by Webkit.

Web based you say? Looking at iOS and Dashboard, there was a clear relationship brewing. Widget icons even looked like iOS app icons! Widget developers rejoiced!

Porting Widgets over to iOS would be trivial and expose our apps to a much, much larger audience. Apple went so far as to release Dashcode, an application to help drive both Widget and iOS web app development. Front-end web developers were suddenly on equal footing with application developers.

And that high would run for another year until Apple announced the App Store and supplemental SDK to build native, third-party applications for iOS.

Bummer. With the App Store exploding with new apps every day, users turned their focus away from Dashboard and Apple soon did the same.

What could have been

Imagine if Apple had stuck to its guns and kept native third-party apps off its devices. The web might look different. Web standards might have advanced more quickly even across other platforms as Android and Windows Phone could very easily have followed Apple's lead on web based apps.

In the end, Dashboard had a good run and without it I may never have conquered my scripting fears by taking on things like PHP and SQL. In that regard, I owe Apple a lot of gratitude.

Open Graph image CC Nikita Kashner

Product Management: Any Given Day

22 June 2018 at 14:53

Product Management is sometimes viewed as a nebulous role. Even among PMs there's both great mystery and interest in each other's process, routine, and general way of doing things.

To shine some light on what a PM actually does from day-to-day, I thought it would be fun to document my events from a single day providing Product Management support for the User and Account Management team on the Emma side of Campaign Monitor.

It turned out to be a good day to document. I had a wide range of activities happening from product release to sessions with Product Marketing as well as interactions with members from other teams across the business.

6:00am Getting the day started

I'm up at 6:00 each day, spending two hours cooking a leisurely breakfast, checking in on the vegetable garden, and tending to other household items like making sure the pets are fed.

I try to avoid Slack and email until I'm in the office. I'll glance at the notifications on my phone or look at my calendar to get a sense of what's ahead for the day, but I try to maintain a boundary from starting my work day until I'm in the office.

7:50am - 8:40am Commute

My commute is about 50 minutes in the morning. Podcasts are a must. Seeking Wisdom, Inside Intercom, and Freakonomics are a few staples, among others.

8:45am - 9:00am Coffee & Slack

I grab coffee and have about 15 minutes to catch a glance of what lies ahead for the day before heading into my first meeting at 9:00. Today I'm starting with Slack, continuing a conversation with our head technical writer based in the Australia office. She and I are discussing how we can better manage communication over feature deployments and associated support documentation.

9:00am - 9:30am Meet with an Account Executive

Our AE outlined a few specific use-cases that a prospect is hoping our platform can handle via our API. I'm providing context around the capabilities of our API both making sure the use-cases his prospect is putting forth are possible with our platform, but also helping to empower him with more technical knowledge should the prospect come back with further questions.

9:30am - 11:00am Unscheduled time

Today I'm using these 90 minutes of unscheduled time to respond to email, add a couple of tickets to the backlog, and make some last-minute additions to supporting documentation for a set of features that we're enabling for all customers at 11:00am.

The pertinent email is a response to one of our 3rd party developers that may be helping us provide some updates to our WordPress plug-in.

The backlog tickets are a continuation of work completed in the previous sprint and in progress in the current sprint. They are the final piece of an effort to move a specific set of admin controls from a legacy section of our app to a new, modern experience. They include sunsetting the old, and communicating that change to customers.

Documentation for our feature release at 11:00am required a few late additions to represent a feature that was finished ahead of schedule. Bonus!

11:00am Product release!

Today's product release was a soft enablement of our new two-factor authentication options. Typically we'll iterate on a feature over several sprints, deploying to production behind a feature flag. For today's enablement, all that was left to do was flip the switch, enabling the feature for all users. Leading up to this point we ran a small beta group of customers in order to test and gather feedback. In this case, the feature will be available for customers, but we won't push an in-app (Intercom) announcement for a few more days.

Two-factor authentication

11:30am Team stand-up

Pretty typical routine, here. Our team consists of two developers, a Technical Lead, Engineering Manager, and Product Manager. We each talk through what we accomplished yesterday and what we plan to tackle today.

1:30pm - 2:45pm Product Marketing Monthly Sync

This is a time for the two teams to talk through what product has released in the past 30 days and what's expected in the coming 30 days. This helps drive Product Marketing's campaigns going forward and we talk through what they have planned on their roadmap.

3:00pm Discovery Check-in

We're working toward a dual track development process. For us, that means in addition to the normal development process of fixing bugs and delivering new features, we're also spending time in "Discovery" doing research around a larger problem we'd like to tackle in the coming weeks. Currently, we're working through 6-week chunks. Presumably, what we research in the current six weeks will be validated and ready to develop over the next 6 weeks. Discovery includes defining a problem and then crafting potential solutions with the help of Product Design and the team's Engineering Lead. That's then validated with customers and we decide whether or not to move forward.

A 6 week cycle roadmap with discovery

4pm Pride Celebration

The day ended a little early with the arrival of snow cones and a local non-profit coming into the office to give a talk about the work they're doing with LGBTQIA youth in Nashville and surrounding counties.

Post Day

While my day winds down, half our team is two time zones away on the west coast. I do my best to keep an eye on Slack should anything arise that requires collaboration. Today we had a late bug arise around users and API keys. This required a few minutes of defining the expected outcome of the fix.

That's largely it. Key items missing from today that do show up regularly on other days include customer and prospect conversations and larger team meetings to help groom and plan future work.

How does your typical day compare?

APRS beacon with Uputronics GPS Board and a Raspberry Pi 3

10 February 2018 at 07:25

For more content like this, subscribe to Amateur Radio Weekly, a weekly summary of the most relevant content in the world of Ham Radio. ♦

This is a detailed, step-by-step guide to using an Uputronics GPS board as the main component of a Pi based APRS position beacon. This project is very similar to my previous APRS beacon project using a USB GPS module. I worked closely with Chris, K7AZ, who was gracious enough to lend out the GPS board for this project.

The Uputronics GPS board is typically used as a timing device/NTP server for the Pi in absence of a real time clock (using date and time info via GPS). For this project I'm more interested in the positional data coming from the board.

The board itself is a great piece of hardware. The GPS signal locks in almost instantly. Even in the house. It performs much better than the USB GPS modules I've used in the past. On to the guide.

Download Raspbian Stretch Lite, write to SD card

https://www.raspberrypi.org/downloads/raspbian/

I'm using the CLI only version to keep the system load low, but this should work just the same with the desktop environment version.

I used the Startup Disk Creator app that comes with Ubuntu to create my SD card.

Attach GPS board, antenna, SD card to Pi.

Boot Pi

Configure Pi with raspi-config app

$ sudo raspi-config

Set locale to en_US.UTF-8.

Timezone
Keyboard: English (US), generic PC keyboard
Wi-fi US

Interfacing options menu
Enable SSH
P6 Serial -> Login Shell (no) Hardware (yes)

Board setup

I borrowed most of the GPS board setup from Anthony Stirk (M0UPU).

$ sudo nano /boot/config.txt

Add at the bottom:

# Allow the normal UART pins to work
dtoverlay=pi3-miniuart-bt
dtoverlay=pps-gpio,gpiopin=18

Save and Quit Nano.

Enable wi-fi

$ sudo nano /etc/wpa_supplicant/wpa_supplicant.conf

Go to the bottom of the file and add the following:

network={
  ssid="wifi-name"
  psk="wifi-password"
}

Wifi reference.

Update the Pi

$ sudo apt update
$ sudo apt upgrade

Disable Bluetooth serial support (I'm guessing the GPS serial connection conflicts with the Bluetooth serial interface)

$ sudo systemctl disable hciuart

Mask the serial console service so a login prompt (getty) can't claim ttyAMA0, the port the GPS uses

$ sudo systemctl mask serial-getty@ttyAMA0.service

Install PPS tools (pulse per second), which read the once-per-second pulse the GPS board emits along with GPS/time info

$ sudo apt install pps-tools
$ sudo apt install libcap-dev
$ sudo reboot

Verifying PPS is working

Ensure the GPS has a signal lock and the green PPS LED on the Uputronics Pi+ GPS Expansion Board is blinking once per second.

$ dmesg | grep pps

Output should be similar to:

[ 2.443494] pps_core: LinuxPPS API ver. 1 registered
[ 2.446699] pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
[ 2.471796] pps pps0: new PPS source pps.-1
[ 2.471886] pps pps0: Registered IRQ 498 as PPS source
[ 6.965166] pps_ldisc: PPS line discipline registered
$ sudo ppstest /dev/pps0

Output should be similar to:

trying PPS source "/dev/pps0"
found PPS source "/dev/pps0"
ok, found 1 source(s), now start fetching data...
source 0 - assert 1418933982.998042450, sequence: 970 - clear 0.000000000, sequence: 0
source 0 - assert 1418933983.998045441, sequence: 971 - clear 0.000000000, sequence: 0

If you see β€œConnection timed out,” the GPS may not have a solid signal or the board may not be properly set on the Pi. This tripped me up for a while.

Set up GPSD

Edit gpsd config
This took a while to figure out. The Pi version of gpsd assumes a USB GPS device will be attached, so we have to disable USB auto-config and define the serial device in the config file.

$ sudo nano /etc/default/gpsd
Change USBAUTO="false"
Change DEVICES="/dev/ttyAMA0"
$ sudo /etc/init.d/gpsd restart

Test gpsd

$ cgps -s

You should see GPS info populated: Time, lat, lon, grid square, etc.

Reboot for good measure and run cgps -s again.
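Among the fields cgps displays is a Maidenhead grid square, which is just a base-conversion of the lat/lon. As an aside, here's a minimal awk sketch of that conversion — a hypothetical helper for illustration, not part of the guide's toolchain:

```shell
# Hypothetical helper: derive a 6-character Maidenhead grid square
# from decimal-degree lat/lon (the same values cgps displays).
to_grid() {
  awk -v lat="$1" -v lon="$2" 'BEGIN {
    F = "ABCDEFGHIJKLMNOPQRSTUVWX"; s = "abcdefghijklmnopqrstuvwx"
    # Shift so both coordinates are positive
    lon += 180; lat += 90
    # Field (letters), square (digits), subsquare (letters)
    printf "%s%s%d%d%s%s\n",
      substr(F, int(lon/20)+1, 1), substr(F, int(lat/10)+1, 1),
      int((lon%20)/2), int(lat%10),
      substr(s, int((lon%2)*12)+1, 1), substr(s, int((lat%1)*24)+1, 1)
  }'
}
to_grid 36.1627 -86.7816   # prints EM66od (Nashville)
```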

Setting up the local web server and gpsd script

Install apache web server which will read GPS info and feed the APRS script

$ sudo apt install apache2 -y

Install PHP which will run our APRS script

$ sudo apt install php libapache2-mod-php -y

Install gpsd script that feeds GPS info to APRS script

$ cd /var/www/html
$ sudo wget http://git.savannah.gnu.org/cgit/gpsd.git/plain/gpsd.php.in

Rename gpsd script

$ sudo mv gpsd.php.in gpsd.php

Execute gpsd script

$ sudo php gpsd.php

If successful, you should see a bunch of HTML in the console and navigating to http://localhost/gpsd.php?op=json in your Pi's web browser should produce a bunch of plaintext looking GPS information.
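That JSON follows gpsd's report format, where TPV reports carry the position fix. As a quick illustration of pulling a fix out of it, here's a sketch that parses a canned sample report (the values are made up); on the Pi you'd feed it the live response from gpsd.php instead:

```shell
# Canned sample of a gpsd-style TPV report (values are made up);
# a live response would come from http://localhost/gpsd.php?op=json
sample='{"class":"TPV","lat":36.1627,"lon":-86.7816}'
echo "$sample" | python3 -c 'import sys, json; d = json.load(sys.stdin); print(d["lat"], d["lon"])'
```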

Edit gpsd.php

$ sudo nano gpsd.php

On line 100, change the 2000 value to 4000.

Install AFSK software modem

This is a Python library that generates Bell 202 AFSK audio samples and AFSK encoded APRS/AX.25 packets.

$ sudo apt install python-pip python-dev
$ sudo pip install afsk

If you get a "TypeError: unsupported operand type(s) for -=: 'Retry' and 'int'", try rebooting and run sudo pip install afsk one more time.

Set your Pi's output audio level to 60.

$ alsamixer

Press the 'up' key on your keyboard until you get to 60. Press 'esc.'

Force audio output through 3.5mm jack, not HDMI

$ sudo raspi-config

Choose "Advanced Options"
Choose "A4 Audio"
Choose option 1
Exit

Install main APRS script

Download main beacon script from Github

$ cd ~/
$ wget https://gist.githubusercontent.com/Cale/699979c3f597378dfaca/raw/538f95b73efbf808004e785bff3d407e2da2ce36/aprs-position-beacon.php

Edit the script and add your callsign to line 13

$ nano aprs-position-beacon.php

Test run the script

$ php aprs-position-beacon.php

You can ignore any "PHP Notice" messages. You'll see lat/lon info on the screen. If the GPS board is unable to pick up a signal, you'll see "No GPS data is available."

When a GPS signal is acquired, you should see "Transmitting beacon" and "Playing WAV."

At that point you should see a "packet.wav" file show up in your home directory. That's another confirmation the script is working.

$ ls ~/

If you plug the Pi's audio into a speaker, you should hear the APRS/AFSK modem sounds.
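For the curious, the position payload inside an APRS packet doesn't use decimal degrees: latitude is encoded as ddmm.mm plus a hemisphere letter (longitude similarly as dddmm.mm). A rough sketch of that conversion — a hypothetical helper illustrating the kind of formatting the beacon script has to do internally:

```shell
# Hypothetical helper: decimal-degree latitude -> APRS ddmm.mm format.
to_aprs_lat() {
  awk -v d="$1" 'BEGIN {
    h = (d < 0) ? "S" : "N"       # hemisphere letter
    if (d < 0) d = -d
    deg = int(d)                  # whole degrees
    min = (d - deg) * 60          # fractional degrees -> minutes
    printf "%02d%05.2f%s\n", deg, min, h
  }'
}
to_aprs_lat 36.1627   # prints 3609.76N
```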

For the HT, I plug the audio cable into the mic port and turn on VOX. Your HT should then transmit your position!

Next steps: Starting the beacon script on boot.
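One simple way to tackle that next step is a crontab @reboot entry — an assumption on my part, since the guide doesn't prescribe a method; the sleep gives gpsd time to come up first:

```shell
# Hypothetical approach: run the beacon at boot via cron.
# Edit the pi user's crontab with: crontab -e
# Then add this line (the path assumes the script sits in /home/pi):
@reboot sleep 30 && php /home/pi/aprs-position-beacon.php >> /home/pi/beacon.log 2>&1
```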
