I need your money. It's for robots.

Hey, look. It's 2024 and VRAM is still really expensive on consumer GPUs that play nicely with AI training. It turns out that applications running inference with large language models like to live in big houses that have at least 16 GB of space. It's still pretty rough out there for anything over 8 GB. Sheesh.
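For the skeptical: a back-of-envelope sketch of why 16 GB is the magic number. Model weights alone eat parameter count times bytes per parameter, before you even budget for the KV cache or activations. The numbers below are illustrative arithmetic, not a benchmark.

```python
def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    """Rough VRAM needed just to hold model weights, in GB.

    Ignores KV cache, activations, and framework overhead,
    all of which pile on top of this floor.
    """
    return params_billions * 1e9 * bytes_per_param / 1e9

# A 7B model in fp16: 7e9 params * 2 bytes = 14 GB -- hello, 16 GB cards.
print(weights_gb(7, 2.0))  # 14.0
# The same model quantized to 4-bit (0.5 bytes/param): ~3.5 GB,
# finally within reach of the humble 8 GB card.
print(weights_gb(7, 0.5))  # 3.5
```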

Goodness

A single tear for: the abandoned Tesla P40s with their tempting price tags and dwindling documentation, all of the EOL Nvidia Jetson development kits somehow still being sold well over MSRP, and every last 2060.

Anyway, back to the issue. I need money. Possibly yours, or the government's, or... hint, hint... some of those R&D dollars after you hire me to develop some CRAZY ideas involving natural language processing.

I mean, it could just be for regular software stuff too. I enjoy making systems and stacks work together.

Speaking of which,

Hi there. My name is Hunter. I'm all sorts of things, but if you have ended up here, then for you, I am your adventurous ML engineer, versatile software developer, or enthusiastic information technology specialist. If I were a class, these would be my implements.

This page still needs a little work. In the meantime, here's my LinkedIn: https://www.linkedin.com/in/hwilliamsf/.

Stay tuned.

TBA