Community generated repository of interactive checklists that work on your computer, tablet, or phone by Armadeus Demarzi.
It’s beautiful.
The aesthetics of their vision are fantastic.
If you like this, you might also like SRCL, which is unrelated but similar in vibes:
SRCL is an open-source React component and style repository that helps you build web applications, desktop applications, and static websites with terminal aesthetics.
XB-1, the first privately funded civilian supersonic aircraft, flew its first supersonic flight yesterday. XB-1 was made in America by Boom Supersonic, which was founded by ex-Groupon executive Blake Scholl.
Coupons to supersonic aircraft. The roaring 20s are back.
Most news media skipped this accomplishment. The WSJ said supersonic isn’t newsworthy until Overture is carrying passengers; the NYT didn’t cover it. But the Financial Times covered it well.
The lessons learned from XB-1 will be used to build Overture, restoring commercial supersonic flight: ~15 minutes slower than Concorde over the Atlantic but 75% more affordable.
What an incredible story over a decade in the making. Congratulations to Blake and the team.
Unfortunately, today has become a sad day for aviation, with reports of a tragic mid-air collision at Ronald Reagan Washington National Airport this evening. My prayers are with the families involved.
I only get to write posts here early or late in the work week, so I still wanted to share this timely news even at a difficult time. I believe Boom’s achievement will inspire future engineers; I share the optimism and love the XB-1 story. At the same time, I hope aviation safety will continue to improve, with careful investigation into what went wrong tonight.
Daniel & Michael Han:
DeepSeek-R1 has been making waves recently by rivaling OpenAI’s O1 reasoning model while being fully open-source. We explored how to enable more local users to run it and managed to quantize DeepSeek’s R1 671B parameter model to 131GB in size, an 80% reduction from the original 720GB, whilst keeping it very functional.
By studying DeepSeek R1’s architecture, we managed to selectively quantize certain layers to higher bits (like 4bit), and leave most MoE layers (like those used in GPT-4) to 1.5bit (see Unsloth Dynamic 4-bit). Naively quantizing all layers breaks the model entirely, causing endless loops and gibberish outputs. Our dynamic quants solve this.
The 1.58bit quantization should fit in 160GB of VRAM for fast inference (2x H100 80GB), attaining around 140 tokens per second. You don’t need VRAM (a GPU) to run the 1.58bit R1; just 20GB of RAM (CPU) will work, though it may be slow. For optimal performance, we recommend the sum of VRAM + RAM be at least 80GB.
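To make the dynamic quantization idea in that quote concrete, here is a toy sketch of layer-selective bit assignment: sensitive layers get more bits, the bulky routed MoE expert weights get fewer. The layer names and selection rules below are hypothetical illustrations I made up for this post, not Unsloth’s actual code.

```python
# Toy sketch only: assign bit-widths by layer name, in the spirit of
# "dynamic" quantization. Rules and names are hypothetical, not Unsloth's.

def choose_bits(layer_name: str) -> float:
    """Keep sensitive layers at higher precision, push MoE experts lower."""
    # Hypothetical rule: attention, embeddings, norms, and shared experts stay at 4-bit.
    if any(key in layer_name for key in ("attn", "embed", "norm", "shared_expert")):
        return 4.0
    # The (much larger) routed MoE expert weights drop to ~1.58-bit.
    if "experts" in layer_name:
        return 1.58
    return 4.0

example_layers = [
    "model.layers.3.self_attn.q_proj.weight",
    "model.layers.3.mlp.experts.17.down_proj.weight",
    "model.layers.3.mlp.shared_experts.gate_proj.weight",
    "model.embed_tokens.weight",
]

for name in example_layers:
    print(f"{name}: {choose_bits(name)}-bit")
```

The point of the split is the one the quote makes: quantizing everything down to 1.5-bit naively breaks the model, so the savings have to come from the layers that can tolerate it.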
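If you want to try the local-inference side of this, here is a rough sketch of fetching the 1.58-bit files with huggingface_hub. The repo id and file pattern are my assumptions based on Unsloth’s write-up, so check their Hugging Face page for the exact names before running it.

```python
# Sketch: download the 1.58-bit dynamic quant shards for local use.
# Repo id and filename pattern are assumptions; verify against the actual repo.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="unsloth/DeepSeek-R1-GGUF",   # assumed repo id
    allow_patterns=["*UD-IQ1_S*"],        # assumed pattern for the 1.58-bit shards
    local_dir="DeepSeek-R1-GGUF",
)

# The downloaded GGUF shards can then be run with llama.cpp (llama-cli or
# llama-server), pointing it at the first shard and offloading as many layers
# to the GPU as your VRAM allows via the -ngl flag.
```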