A private research initiative exploring AI, technology, construction, and beyond.
My recent work explores a foundational shift in AI compute: eliminating floating point arithmetic entirely. I have demonstrated that full transformer models can be trained with integer-only pipelines, a change that unlocks significant gains in performance, efficiency, and hardware simplicity.
Trained a GPT-style transformer using only integer arithmetic — no floating point at all.
This is a potentially world-first milestone: it demonstrates that transformer training can be achieved with integer-only math, eliminating the need for floating point units entirely. It points to a fundamental change in how large models can be trained and deployed, enabling faster, simpler, lower-power hardware across the AI stack.
Matched FP32-level accuracy on MNIST and CIFAR-10 using fully integer pipelines.
These benchmarks served as early tests to incrementally validate the stability and learning fidelity of the integer-only training pipeline.
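To make the idea concrete for technical readers, the following is a minimal illustrative sketch, in Python/NumPy, of what one step of an integer-only pipeline can look like: a fixed-point linear layer whose multiply-accumulate and rescaling are done entirely in integer arithmetic. The bit widths, scale factors, and rounding shown here are assumptions chosen for illustration, not the scheme used in this work.

```python
import numpy as np

# Hypothetical fixed-point parameters (illustrative assumptions only)
FRAC_BITS = 8            # number of fractional bits in the fixed-point format
SCALE = 1 << FRAC_BITS   # 2^8 = 256

def to_fixed(x):
    """Quantize a float array to int32 fixed-point (illustration only)."""
    return np.round(x * SCALE).astype(np.int32)

def int_linear(x_q, w_q, b_q):
    """Integer-only linear layer: y = x @ W^T + b, computed entirely in integers.

    x_q, w_q, b_q are int32 fixed-point tensors with FRAC_BITS fractional bits.
    The int64 accumulator avoids overflow; the shift rescales the product
    back to FRAC_BITS so the output stays in the same fixed-point format.
    """
    acc = x_q.astype(np.int64) @ w_q.astype(np.int64).T   # integer multiply-accumulate
    acc >>= FRAC_BITS                                      # rescale product via arithmetic shift
    return (acc + b_q).astype(np.int32)

# Tiny usage example
x = to_fixed(np.array([[0.5, -0.25]]))
w = to_fixed(np.array([[1.0, 2.0], [-0.5, 0.75]]))
b = to_fixed(np.array([0.1, -0.2]))
y_q = int_linear(x, w, b)
print(y_q / SCALE)   # dequantized only for display
```

The point of the sketch is simply that once weights and activations are quantized, a forward step needs only integer multiplies, adds, and shifts; the actual training pipeline, quantization scheme, and gradient handling in this work are documented separately.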
One-cycle integer MAC outperforms HardFloat FP32 across all key metrics.
Speed/Area: 1.79x — Energy/Area: 5.98x — Speed/Energy: 5.27x
This is a disruptive, apples-to-apples baseline showing that integer arithmetic can outperform floating point on its own terms, and it does not yet count the additional systemic gains integer designs enable. It breaks the traditional performance-power-area triangle by improving all three simultaneously.
Implemented entirely with open-source tools (OpenLane, Yosys), with no architectural optimization.
These results were achieved without fine-tuning, suggesting significant headroom remains for even greater gains with dedicated design effort.
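For context on the unit being compared, the sketch below is a simple bit-accurate software model of what a one-cycle signed integer multiply-accumulate computes. The operand and accumulator widths are illustrative assumptions, not those of the actual RTL; the sketch only shows that the integer datapath has none of the exponent-alignment, normalization, or rounding stages a floating point MAC requires.

```python
# Hypothetical bit-accurate model of a one-cycle signed integer MAC.
# Bit widths are illustrative assumptions, not those of the actual design.
OP_BITS = 8          # operand width
ACC_BITS = 32        # accumulator width

def wrap(value, bits):
    """Interpret an integer as a two's-complement value of the given width."""
    mask = (1 << bits) - 1
    value &= mask
    return value - (1 << bits) if value & (1 << (bits - 1)) else value

def mac_cycle(acc, a, b):
    """One MAC cycle: acc <- acc + a*b, with two's-complement wraparound.

    There is no exponent alignment, normalization, or rounding here,
    which is a large part of why an integer datapath is so much
    smaller and simpler than a floating point one.
    """
    return wrap(acc + wrap(a, OP_BITS) * wrap(b, OP_BITS), ACC_BITS)

# Accumulate a short dot product
acc = 0
for a, b in [(3, -7), (120, 5), (-128, 2)]:
    acc = mac_cycle(acc, a, b)
print(acc)   # 3*(-7) + 120*5 + (-128)*2 = 323
```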
Design is radically simpler, accelerating the path from concept to silicon.
Simplified hardware reduces development cost, shortens verification timelines, and lowers barriers to innovation across the AI hardware stack.
These accomplishments point to a foundational shift in AI compute, impacting everything from chip design to data center infrastructure and beyond.
Supporting documentation is available upon request for qualified technical reviewers, collaborators, or investors.
Freeman Constructs is a personal innovation lab and future-facing brand for private research, engineering, and venture exploration. It spans work in AI, tech architecture, construction concepts, and investment strategy. The current focus is a hardware-level breakthrough in AI compute: full transformer training using integer arithmetic, a potential paradigm shift in energy-efficient AI hardware.
Aaron Freeman is a lifelong technologist and entrepreneur. He began coding at age 8, launched a BBS in 1983, and later earned a B.S. in Computer Science and an M.S. in Electrical Engineering (with a focus on computer architecture) from Wichita State University.
While pursuing his Ph.D., he received a Teaching Fellowship and taught courses in programming, computer architecture, genetic algorithms, and other subjects. He founded Layer-Z, a consulting firm serving major clients including AmeriServe (formerly PepsiCo Food Services). He was awarded a U.S. patent for the first internet-connected sprinkler system (US-7123993-B1) and founded SendThisFile, the first file-transfer-as-a-service platform.
He and his wife Larissa are active investors and entrepreneurs across sectors including retail, real estate, and infrastructure. He holds a Class A General Contractor's License and has led development efforts in robotics, cloud, real estate, trading, and AI.