Issue #10 · 22 min read · 11 stories

a16z Raises $15B, Meta Secures Nuclear AI Power

China leads open-weight AI, RL for LLMs scales up, and a powerful new LLM fine-tuning tool.

Andreessen Horowitz raised $15 billion yesterday, signaling continued investor confidence in the AI market, which is particularly relevant for founders currently fundraising. Meta also signed a massive nuclear power deal to fuel its AI superclusters, with implications for future compute availability and energy costs. For builders, new research on scalable RL for LLMs with torchforge and Weaver offers methods to improve post-training, and the ms-swift tool provides a unified framework for fine-tuning over 600 LLMs.

NEWS
4 stories
2

Stanford: China Led Open-Weight AI in 2025

A Stanford analysis reports China took the global lead in open-weight AI development during 2025, with Chinese models surpassing US counterparts in worldwide distribution and adoption. This shift brings geopolitical and security risks to the forefront.

3

Meta Buys 6.6 GW Nuclear Power for AI

Meta signed a deal to buy 6.6 gigawatts of nuclear power by 2035, becoming one of the largest corporate nuclear energy purchasers. This power will fuel its next-gen AI infrastructure, including the Prometheus supercluster.

4

Terence Tao: AI Tackles Erdős Math Problems

Fields Medalist Terence Tao recently commented on the emerging application of AI tools to Erdős problems. His brief remark points to AI's expanding role in fundamental mathematical research.

TECHNICAL
2 stories
1

Meta Simplifies LLM RL with New Tools

Meta and PyTorch open-sourced `torchforge`, a PyTorch-native library for scalable LLM post-training, and `Weaver`, a weak verifier for annotation-free reward signals. This stack achieved 4x faster iteration and 65% GPU utilization on a 512-GPU cluster, showing performance gains on MATH, GPQA, and MMLU Pro for Qwen3-8B and Qwen3-32B models.
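Weaver's core idea, combining several cheap, imperfect verifiers into one annotation-free reward signal, can be sketched in a few lines. This is an illustrative toy, not Weaver's actual API; the verifier functions and weights here are made up.

```python
# Toy sketch of annotation-free reward via weak-verifier aggregation
# (the idea behind Weaver); not the library's real interface.
from typing import Callable, List

Verifier = Callable[[str, str], float]  # (prompt, response) -> score in [0, 1]

def aggregate_reward(prompt: str, response: str,
                     verifiers: List[Verifier],
                     weights: List[float]) -> float:
    """Weighted vote over weak verifiers; in practice the weights could be
    estimated from verifier agreement statistics instead of human labels."""
    total = sum(weights)
    return sum(w * v(prompt, response)
               for v, w in zip(verifiers, weights)) / total

# Two deliberately weak verifiers: a length sanity check and a keyword check.
length_ok = lambda p, r: 1.0 if 10 <= len(r) <= 2000 else 0.0
has_answer = lambda p, r: 1.0 if "answer" in r.lower() else 0.0

reward = aggregate_reward("What is 2+2?", "The answer is 4.",
                          [length_ok, has_answer], [0.5, 0.5])
```

Each individual verifier is unreliable on its own; the aggregate becomes a usable RL reward precisely because errors of independent weak checks tend not to coincide.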

2

Senior Engineer's Claude Code Workflow Tips

A senior engineer details integrating Claude Code into dev workflows. They suggest using AI to cut interruptions, move up abstraction layers, and ensure code quality with rigorous testing. Tips include structured documentation for agents and favoring Rust/TypeScript over Python.

ANALYSIS
4 stories
1

a16z: More Than Just a VC Fund

A recent analysis frames Andreessen Horowitz (a16z) as a "Firm" that actively builds and sells power to its portfolio companies, not merely a fund. Their strategy involves extensive platform services, from marketing to government relations, as seen with Databricks.

2

LLMs Reproduce Training Data, Raising Copyright Risk

Research shows major AI models like GPT, Claude, and Llama can reproduce large portions of their training data, including entire books. This suggests models act more like "lossy compression" than true learners, opening AI companies to significant copyright infringement liabilities.
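One simple way to quantify this kind of reproduction is to measure the longest contiguous span a model's output shares verbatim with a source text. The sketch below uses Python's standard library for illustration; the cited research relies on far more rigorous extraction methodology.

```python
# Sketch: measure verbatim overlap between a source text and model output
# via the longest common contiguous substring. Illustrative only.
from difflib import SequenceMatcher

def longest_verbatim_span(source: str, generated: str) -> str:
    """Return the longest contiguous span shared by source and generated."""
    m = SequenceMatcher(None, source, generated, autojunk=False)
    match = m.find_longest_match(0, len(source), 0, len(generated))
    return source[match.a: match.a + match.size]

book = "It was the best of times, it was the worst of times."
output = "As the novel opens: it was the worst of times, indeed."
span = longest_verbatim_span(book, output)  # -> "it was the worst of times"
```

Long verbatim spans relative to the prompt length are the kind of evidence that supports the "lossy compression" framing over genuine abstraction.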

3

Claude Code Finds Cancer Drug Targets

An author used Claude Code to quickly identify cancer-selective gene targets from the DepMap dataset, a task previously too complex and time-consuming. This case study shows LLMs can generate 100% of the code for ambitious data analysis projects, uncovering promising targets like YRDC and TFRC.
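The shape of such a screen is straightforward: on a DepMap-style gene-effect matrix, look for genes whose knockout hurts the cancer lines of interest far more than other lines. The sketch below uses made-up data and a hypothetical threshold; the actual analysis in the story was generated by Claude Code against the real DepMap dataset.

```python
# Illustrative cancer-selectivity screen on a DepMap-style gene-effect
# matrix (rows = cell lines, cols = genes; more-negative scores mean
# stronger dependency). All data and thresholds here are invented.
import pandas as pd

effects = pd.DataFrame(
    {"GENE_A": [-1.2, -1.1, -0.1, -0.2],   # selective: only tumor lines depend on it
     "GENE_B": [-1.0, -0.9, -1.1, -1.0]},  # pan-essential: every line depends on it
    index=["tumor_1", "tumor_2", "other_1", "other_2"],
)
is_target_lineage = pd.Series([True, True, False, False], index=effects.index)

def selective_genes(df: pd.DataFrame, mask: pd.Series, gap: float = 0.5):
    """Genes with much stronger mean dependency in the target lineage
    than elsewhere (a larger in/out gap means more cancer-selective)."""
    diff = df[~mask].mean() - df[mask].mean()
    return diff[diff > gap].index.tolist()

hits = selective_genes(effects, is_target_lineage)  # -> ["GENE_A"]
```

Filtering out pan-essential genes like GENE_B is the key step: a gene every cell needs is a poor drug target, however strong its dependency score.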

4

Analysis: Open APIs Are Ending

An analysis argues the open API era is closing, with major software players like Salesforce and Datadog tightening access and data portability. AI's speed lets incumbents expand their offerings, pushing startups and incumbents toward end-to-end stack ownership.

TOOLS
1 story
1

Fine-Tune 600+ LLMs with `ms-swift` Framework

The `ms-swift` GitHub repository offers a comprehensive framework for fine-tuning over 600 LLMs (like Qwen3, Llama4) and 300 MLLMs (Qwen3-VL, InternVL3.5). It supports various techniques including PEFT, full-parameter, CPT, SFT, DPO, and GRPO.
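A typical invocation looks like the following sketch of a LoRA SFT run via the `swift` CLI. Flag names have shifted across ms-swift versions, and the model and dataset identifiers here are placeholders, so treat this as a shape to adapt from the repo's own docs rather than a copy-paste command.

```shell
# Hypothetical sketch of a LoRA supervised fine-tune with ms-swift.
# Verify flag names and model/dataset IDs against the ms-swift docs.
pip install ms-swift

swift sft \
    --model Qwen/Qwen2.5-7B-Instruct \
    --train_type lora \
    --dataset your-org/your-sft-dataset \
    --num_train_epochs 1 \
    --output_dir ./output
```

The same CLI exposes the other training regimes the repo lists (full-parameter, CPT, DPO, GRPO) by swapping the subcommand or `--train_type`, which is what makes one framework cover hundreds of model families.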