mirror of
https://github.com/blackboxprogramming/alexa-amundson-resume.git
synced 2026-03-18 05:34:08 -05:00
20 role-specific resumes with verified KPIs — BlackRoad only, no prior experience
RoadChain-SHA2048: 428ab11c02ce78d6 RoadChain-Identity: alexa@sovereign RoadChain-Full: 428ab11c02ce78d628aa30489d9f0f3251e709352f2deacf05882435ed9f5d114fe2a1c9e75b3c831688f47cd9032c22b388f821b1b29dcac9fc9a3ad4a1b39f1210d1275f9472df606b763bb551961d1eaebfe8f2a4b9c23d3f3da3f001d916e03ff920def04c8304d8544ac916e4c50c16da942dcc830388e298b7c016b991320b30f7d3fe153aaab71ab109aea3f9dca996ac6e14ca1c0969248c8ca2767ab631c17dc86c0c2a8edd1c8965ab3ba6c92ba7cc9aa4d74406058a39d8fdec53a200371b7d1e1214a860a7ff2c53b83b09f516cec69cbe00e3556caee7f813e4a09d3f430a3a3eab5d4763f8975999c31bd77f82972ab8d7c2d7c5aedcce9442
roles/10-edge-computing-engineer.md · new file · 71 lines added

# Alexa Amundson

**Edge Computing Engineer**

amundsonalexa@gmail.com | [github.com/blackboxprogramming](https://github.com/blackboxprogramming)

---

## Summary

Edge computing engineer operating a 5-node Raspberry Pi fleet with 52 TOPS of AI acceleration, 27 deployed models, WireGuard mesh networking, and carrier-grade WiFi mesh. Builds edge-native services with self-healing automation, thermal management, and a hybrid edge-cloud architecture.

---

## Experience

### BlackRoad OS | Founder & Edge Lead | 2024–Present

**Edge Fleet**

- 5 Raspberry Pi nodes: 4× Pi 5 (8 GB RAM, NVMe), 1× Pi 400 (4 GB RAM)
- 2× Hailo-8 NPUs (26 TOPS each) for on-device AI inference
- 707 GB total fleet storage, 20 GB total RAM
- Docker Swarm orchestration with automatic service placement
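
Swarm placement like the above can be sketched as a small helper that builds a `docker service create` invocation pinned to accelerator-equipped nodes. This is an illustrative sketch, not the actual deployment: the service name, image, and the `node.labels.hailo` label are hypothetical examples.

```python
import shlex

def swarm_service_cmd(name: str, image: str, constraints: list[str],
                      replicas: int = 1) -> str:
    """Build a `docker service create` command that pins a service to
    nodes matching the given Swarm placement constraints."""
    parts = ["docker", "service", "create", "--name", name,
             "--replicas", str(replicas)]
    for c in constraints:
        parts += ["--constraint", c]
    parts.append(image)
    return " ".join(shlex.quote(p) for p in parts)

# Pin an inference service to nodes carrying a Hailo-8 accelerator.
# The label `hailo=true` would be set once per node with `docker node update`.
cmd = swarm_service_cmd("ollama-gw", "ollama/ollama:latest",
                        ["node.labels.hailo == true"])
print(cmd)
```

Swarm then schedules (and reschedules, on node failure) the service only onto nodes whose labels satisfy the constraint, which is what "automatic service placement" buys over plain Docker.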

**Edge AI**

- 27 Ollama models (48.1 GB) running locally across 3 nodes
- 4 custom fine-tuned models for domain-specific inference
- SSE proxy for streaming model responses to web clients
- Image generation pipeline with 4 backend agents
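
The core of such an SSE proxy is re-framing Ollama's newline-delimited JSON stream (`response`/`done` fields in each chunk) as Server-Sent Events. A minimal sketch of that framing step, assuming that stream format; the `token` payload key is an illustrative choice, not a fixed schema:

```python
import json

def to_sse_event(chunk: str) -> str:
    """Convert one newline-delimited JSON chunk from Ollama's
    /api/generate stream into a Server-Sent Events frame.

    SSE frames are `data: <payload>` lines terminated by a blank line;
    a named `done` event lets the browser close its EventSource."""
    obj = json.loads(chunk)
    if obj.get("done"):
        return "event: done\ndata: [DONE]\n\n"
    return f"data: {json.dumps({'token': obj.get('response', '')})}\n\n"

print(to_sse_event('{"response": "Hello", "done": false}'))
```

A real proxy would read the model's HTTP response line by line and yield each frame over a `text/event-stream` connection.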

**Edge Networking**

- RoadNet: 5 WiFi access points (channels 1/6/11) with dedicated 10.10.x.0/24 subnets
- WireGuard mesh VPN (10.8.0.x) connecting all nodes to a cloud hub
- 4 Cloudflare tunnels for secure external access
- Pi-hole DNS, PowerDNS, and custom dnsmasq zones at the edge
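
The mesh wiring can be illustrated by rendering one `[Peer]` section of a node's `wg0.conf`: each peer is allowed only its own /32 mesh address, and only peers with a stable public address (the cloud hub) get an `Endpoint` and keepalive. The key, hostname, and port below are placeholders, not the real deployment values.

```python
from typing import Optional

def wireguard_peer_block(public_key: str, mesh_ip: str,
                         endpoint: Optional[str] = None) -> str:
    """Render one [Peer] section of a wg0.conf for a mesh node.

    AllowedIPs is the peer's /32 mesh address, so traffic stays
    point-to-point inside the 10.8.0.x overlay."""
    lines = ["[Peer]",
             f"PublicKey = {public_key}",
             f"AllowedIPs = {mesh_ip}/32"]
    if endpoint:  # only nodes with a stable public address (the hub)
        lines.append(f"Endpoint = {endpoint}:51820")
        lines.append("PersistentKeepalive = 25")  # hold NAT mappings open
    return "\n".join(lines) + "\n"

# Hub peer with a public endpoint (key and host are placeholders):
print(wireguard_peer_block("BASE64_PUBKEY=", "10.8.0.1", "hub.example.com"))
```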

**Edge Reliability**

- Self-healing cron automation on every node
- Power optimization: CPU governors, voltage tuning, thermal throttle prevention
- Average fleet temperature of 44.8°C after optimization, down from a 73.8°C peak
- 256 systemd services managed across the fleet
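
A self-healing thermal check of the kind a per-node cron job might run can be sketched as follows. Reading `/sys/class/thermal/thermal_zone0/temp` is the standard Linux interface on the Pi (the kernel reports millidegrees Celsius); the 70°C limit and the reaction to exceeding it are illustrative choices, not the fleet's actual policy.

```python
from pathlib import Path

THERMAL = Path("/sys/class/thermal/thermal_zone0/temp")  # Pi SoC sensor

def millidegrees_to_c(raw: str) -> float:
    """The kernel reports temperature in millidegrees Celsius."""
    return int(raw.strip()) / 1000.0

def under_thermal_limit(limit_c: float = 70.0) -> bool:
    """True if the SoC is below the throttle-prevention limit.

    A self-healing cron job could act on a False result, e.g. by
    pausing heavy inference workloads until the node cools down."""
    if not THERMAL.exists():  # not running on a Pi / sensor absent
        return True
    return millidegrees_to_c(THERMAL.read_text()) < limit_c

print(millidegrees_to_c("44800"))  # → 44.8
```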

**Hybrid Architecture**

- Edge nodes handle AI inference, local services, DNS, and monitoring
- Cloud (Cloudflare) handles 99 Pages deployments, 22 D1 databases, and CDN
- DigitalOcean VMs serve as WireGuard hubs and public endpoints
- Tailscale overlay (9 peers) for cross-network management
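
The edge/cloud split above amounts to a placement policy: latency-sensitive and data-local work stays on the Pis, while static delivery and managed storage go to the cloud. A minimal sketch of that policy as a lookup; the service names and tiers are illustrative, not the actual deployment manifest.

```python
# Illustrative mapping of workload classes to tiers, mirroring the
# split described above (names are examples, not real service IDs).
PLACEMENT = {
    "ai-inference": "edge",   # Hailo-8 / Ollama nodes
    "dns": "edge",            # Pi-hole / PowerDNS
    "monitoring": "edge",
    "static-sites": "cloud",  # Cloudflare Pages
    "databases": "cloud",     # Cloudflare D1
    "vpn-hub": "cloud",       # DigitalOcean WireGuard hubs
}

def tier_for(service: str) -> str:
    """Default unknown services to the edge, where compute is local."""
    return PLACEMENT.get(service, "edge")

print(tier_for("dns"))  # → edge
```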

---

## Technical Skills

**Edge:** Raspberry Pi 5, Hailo-8 NPU, NVMe, PCIe, GPIO, I2C

**Networking:** WireGuard, WiFi mesh, Cloudflare Tunnels, DNS (Pi-hole, PowerDNS)

**AI:** Ollama, Hailo-8 inference, custom model fine-tuning

**Containers:** Docker, Docker Swarm

**Automation:** systemd (256 services), cron (52 tasks), self-healing scripts

---

## Metrics

| Metric | Value |
|--------|-------|
| Edge nodes | 5 |
| AI acceleration | 52 TOPS |
| Models deployed | 27 (48.1 GB) |
| WiFi APs | 5 |
| Fleet storage | 707 GB |
| Avg temperature | 44.8°C |
| Services | 256 |