Mirror of https://github.com/blackboxprogramming/alexa-amundson-resume.git (synced 2026-03-18 06:34:09 -05:00)
Alexa Amundson
Edge Computing Engineer
amundsonalexa@gmail.com | github.com/blackboxprogramming
Summary
Edge computing engineer operating a 5-node Raspberry Pi fleet with 52 TOPS AI acceleration, 27 deployed models, WireGuard mesh networking, and carrier-grade WiFi mesh. Builds edge-native services with self-healing automation, thermal management, and hybrid edge-cloud architecture.
Experience
BlackRoad OS | Founder & Edge Lead | 2025–Present
Edge Fleet
- 5 Raspberry Pi nodes: 4× Pi 5 (8 GB RAM, NVMe), 1× Pi 400 (4 GB RAM)
- 2× Hailo-8 NPUs (26 TOPS each) for on-device AI inference
- 707 GB total fleet storage, 20 GB total RAM
- Docker Swarm orchestration with automatic service placement
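Automatic service placement in Swarm is typically driven by node labels and placement constraints. A minimal stack-file sketch, where the `ollama` service name and the `npu` node label are illustrative rather than taken from the fleet:

```yaml
# stack.yml -- sketch: pin AI inference to NPU-equipped nodes.
# Label the Hailo-8 nodes first, e.g.:
#   docker node update --label-add npu=true <node-name>
services:
  ollama:
    image: ollama/ollama
    deploy:
      replicas: 2
      placement:
        constraints:
          - node.labels.npu == true
```

With constraints like this, Swarm reschedules the service onto another labeled node if one fails, which pairs naturally with the self-healing automation described below.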
Edge AI
- 27 Ollama models (48.1 GB) running locally across 3 nodes
- 4 custom fine-tuned models for domain-specific inference
- SSE proxy for streaming model responses to web clients
- Image generation pipeline with 4 backend agents
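The SSE proxy's core job, re-framing Ollama's streamed NDJSON chunks as Server-Sent Events for browsers, can be sketched as a small translator. Field names (`response`, `done`) follow Ollama's streaming API; the `token` payload key and the `[DONE]` sentinel are illustrative choices, not details from the original:

```python
import json

def ndjson_to_sse(ndjson_lines):
    """Convert NDJSON chunks (as emitted by Ollama's /api/generate
    with streaming enabled) into Server-Sent Events frames.

    Each input line is a JSON object whose "response" field carries
    the next token(s); "done" marks the final chunk."""
    for line in ndjson_lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        # An SSE frame is "data: <payload>" terminated by a blank line.
        yield f"data: {json.dumps({'token': chunk.get('response', '')})}\n\n"
        if chunk.get("done"):
            yield "data: [DONE]\n\n"

# Example: two streamed chunks, the second being the terminal one.
frames = list(ndjson_to_sse([
    '{"response": "Hello", "done": false}',
    '{"response": " world", "done": true}',
]))
```

In a real proxy this generator would sit between an HTTP client reading from Ollama and a `text/event-stream` response to the browser.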
Edge Networking
- RoadNet: 5 WiFi access points (channels 1/6/11), dedicated 10.10.x.0/24 subnets
- WireGuard mesh VPN (10.8.0.x) connecting all nodes to cloud hub
- 4 Cloudflare tunnels for secure external access
- Pi-hole DNS, PowerDNS, custom dnsmasq zones at edge
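A hub-and-spoke WireGuard mesh like the one above is usually a short wg-quick config per node. A sketch for one edge node, with placeholder keys and an assumed hub hostname (only the 10.8.0.x addressing comes from the original):

```ini
# /etc/wireguard/wg0.conf on an edge node (sketch)
[Interface]
Address = 10.8.0.2/24
PrivateKey = <node-private-key>
ListenPort = 51820

[Peer]
# Cloud hub (e.g. a DigitalOcean VM with a public IP)
PublicKey = <hub-public-key>
Endpoint = hub.example.com:51820
AllowedIPs = 10.8.0.0/24
PersistentKeepalive = 25
```

`PersistentKeepalive` keeps NAT mappings alive so the hub can always reach nodes behind home routers.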
Edge Reliability
- Self-healing cron automation on every node
- Power optimization: CPU governors, voltage tuning, thermal throttle prevention
- Avg fleet temperature: 44.8°C (down from 73.8°C peak after optimization)
- 256 systemd services managed across fleet
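The fleet-average temperature figure comes down to aggregating per-node sysfs readings; Raspberry Pi OS exposes the SoC temperature in millidegrees at `/sys/class/thermal/thermal_zone0/temp`. A minimal sketch of the aggregation step, with hypothetical readings (how they are collected from each node is out of scope):

```python
def fleet_avg_temp_c(millideg_readings):
    """Average CPU temperature in degrees C across the fleet.

    Each reading is one node's value from
    /sys/class/thermal/thermal_zone0/temp (millidegrees C)."""
    if not millideg_readings:
        raise ValueError("no readings")
    return sum(millideg_readings) / len(millideg_readings) / 1000.0

# Example: five hypothetical node readings in millidegrees.
avg = fleet_avg_temp_c([44200, 45100, 43800, 46000, 44900])
```

A cron job running this kind of check per node is also a natural trigger point for the self-healing scripts (e.g. throttling workloads when a node runs hot).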
Hybrid Architecture
- Edge nodes handle AI inference, local services, DNS, monitoring
- Cloud (Cloudflare) handles 99 Pages deployments, 22 D1 databases, CDN
- DigitalOcean VMs as WireGuard hubs and public endpoints
- Tailscale overlay (9 peers) for cross-network management
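The split above, where cloud fronts public traffic and edge nodes serve it, is typically wired together with a cloudflared ingress config. A sketch with placeholder tunnel ID and hostname (11434 is Ollama's default port; nothing else here is from the original):

```yaml
# ~/.cloudflared/config.yml (sketch)
tunnel: <tunnel-id>
credentials-file: /home/pi/.cloudflared/<tunnel-id>.json
ingress:
  - hostname: models.example.com
    service: http://localhost:11434   # local Ollama endpoint
  - service: http_status:404          # required catch-all rule, last
```

Each of the tunnels mentioned earlier would carry one such ingress mapping from a public hostname to a service running on an edge node.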
Technical Skills
- Edge: Raspberry Pi 5, Hailo-8 NPU, NVMe, PCIe, GPIO, I2C
- Networking: WireGuard, WiFi mesh, Cloudflare Tunnels, DNS (Pi-hole, PowerDNS)
- AI: Ollama, Hailo-8 inference, custom model fine-tuning
- Containers: Docker, Docker Swarm
- Automation: systemd (256 services), cron (52 tasks), self-healing scripts
Metrics
| Metric | Value |
|---|---|
| Edge nodes | 5 |
| AI acceleration | 52 TOPS |
| Models deployed | 27 (48.1 GB) |
| WiFi APs | 5 |
| Fleet storage | 707 GB |
| Avg temperature | 44.8°C |
| Services | 256 |