Back to archive
Issue #31 · 22 min read · 11 stories

AI Writes 100% of Anthropic's Code

Amazon & Google chip gains against Nvidia; Anthropic hit with $3B copyright lawsuit; new self-hosted agent.

Top engineers at Anthropic and OpenAI stated yesterday that AI writes 100% of their code, signaling a radical shift in how advanced AI products are built. Separately, music publishers sued Anthropic for $3 billion over alleged copyright infringement, raising critical questions for AI content generation and IP strategy. Meanwhile, Amazon and Google are reportedly eating into Nvidia's AI chip market share, a trend impacting future compute availability and pricing.

NEWS
8 stories
1

Music Publishers Sue Anthropic for $3B Over 20,000 Allegedly Pirated Songs

Major music publishers, including Universal Music Group, filed a $3 billion federal copyright lawsuit against Anthropic. They allege 'flagrant piracy' of over 20,000 copyrighted musical works, including songs and lyrics, used to train Claude AI models without authorization. A ruling could accelerate the adoption of licensed training datasets or push toward compulsory licensing, directly impacting how builders source and pay for AI training data.

2

100% AI-Generated Code Now Used by Engineers at Anthropic, OpenAI

Engineers at Anthropic and OpenAI reportedly have 100% of their code generated by AI models like Claude Code. Key figures, including Anthropic's Boris Cherny, say they no longer write code manually, instead editing AI output and directing it at higher-level problems. This points to a future where engineers offload routine coding to AI and shift toward creative problem-solving and generalist skills, aligning with predictions that AI will soon complete most engineering work.

3

Merger Talks Underway for SpaceX, Tesla, xAI

Elon Musk's SpaceX, xAI, and Tesla are reportedly in early-stage talks for a potential merger, with scenarios including SpaceX combining with Tesla or xAI. This consolidation aims to align resources and could facilitate xAI's goal of placing data centers in space. Discussions follow recent corporate filings and $2 billion investments from SpaceX and Tesla into xAI.

5

Model S & X Production Ends as Tesla Pivots to AI

Following its first annual revenue decline, Tesla is pivoting from electric vehicle production to AI and robotics. The company will cease Model S and X production, invest $2 billion in xAI, and convert its California factory to produce Optimus robots. This strategic shift repositions Tesla as a "physical AI company" amid declining EV sales.

6

Custom Chips from Google & Amazon Cut Into Nvidia's Lead

Google and Amazon are expanding their custom AI chip sales, challenging Nvidia's market share. Google's TPUs generated tens of billions in revenue, including a $21 billion deal to supply Anthropic with chips for non-Google data centers. Amazon's Trainium chip revenue also grew 150% quarter over quarter, partly driven by its $4 billion investment in Anthropic. Nvidia still commands 92% of the market, but these trends signal a growing market for Nvidia alternatives, offering builders more diverse compute options as software support expands.

7

Q.ai's Whisper Audio Tech Acquired by Apple for $2B

Apple acquired Israeli AI startup Q.ai for nearly $2 billion, marking its second-largest acquisition. Q.ai's technology interprets whispered speech and clarifies audio in noisy environments using imaging and machine learning. The acquisition could bring Q.ai's audio AI into products like AirPods and Vision Pro, with the founding team joining Apple.

8

AGI Needs New Breakthroughs, Not Just Scale, Says DeepMind CEO

Google DeepMind CEO Demis Hassabis says AGI, defined as full human cognitive ability including creativity and physical intelligence, is 5-10 years out. He argues AGI requires breakthroughs in continual learning, memory, and long-term reasoning, not just scaling LLMs. Hassabis also indicates Google sees AI-powered smart glasses as the "killer app" for a universal digital assistant.

TECHNICAL
4 stories
1

Optical Networking Becomes AI Inference Bottleneck

As the AI buildout shifts from training compute to large-scale inference, optical networking is emerging as the primary bottleneck. Future AI data centers will require photon-based interconnects like Co-Packaged Optics (CPO) and Silicon Photonics for power efficiency and bandwidth, replacing copper. This transition shifts power dynamics across the supply chain.

2

DNA Sequence Model Beats Rivals on 25/26 Variant Effect Prediction Benchmarks

AlphaGenome is a deep learning model that processes 1 Mb of DNA to predict thousands of human and mouse genomic tracks across 11 modalities, including gene expression and splicing. It outperforms specialized models on 25 of 26 benchmarks for regulatory variant effect prediction. The model uses a two-stage training process (pretraining and distillation) for multimodal variant interpretation at single-base-pair resolution.
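Models in this family consume raw DNA as a numeric tensor. As a minimal illustrative sketch (not AlphaGenome's actual pipeline), here is the standard one-hot encoding that turns a DNA window into a 4-channel array suitable as model input:

```python
import numpy as np

# Map each base to a channel index; unknown bases (e.g. N) get an all-zero row.
BASE_TO_IDX = {"A": 0, "C": 1, "G": 2, "T": 3}

def one_hot_encode(seq: str) -> np.ndarray:
    """One-hot encode a DNA sequence into a (len(seq), 4) float array."""
    out = np.zeros((len(seq), 4), dtype=np.float32)
    for i, base in enumerate(seq.upper()):
        idx = BASE_TO_IDX.get(base)
        if idx is not None:
            out[i, idx] = 1.0
    return out

window = one_hot_encode("ACGTN")
```

A 1 Mb input like AlphaGenome's is then simply a (1,000,000, 4) array at single-base-pair resolution.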

3

Deploy AI Agents on Cloudflare, Skip Dedicated Hardware

Cloudflare's Moltworker project runs the Moltbot AI agent on Cloudflare's edge, removing the need for local machines. It uses Workers, Sandbox SDK for code execution, R2 for storage, AI Gateway for model access, and Browser Rendering for web automation. This offers a model for deploying AI agents on the edge, providing efficiency, security, and observability.
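The pattern generalizes beyond Cloudflare: an edge agent is essentially a request handler that forwards prompts through a gateway to a model and persists conversation state to object storage. A minimal Python sketch of that loop, with hypothetical `call_model` and `ObjectStore` stand-ins for AI Gateway and R2 (not the Moltworker APIs):

```python
import json

def call_model(prompt: str) -> str:
    """Stand-in for a gateway-routed model call (hypothetical)."""
    return f"echo: {prompt}"

class ObjectStore:
    """Stand-in for R2-style key/value object storage (hypothetical)."""
    def __init__(self):
        self._data = {}
    def put(self, key: str, value: str) -> None:
        self._data[key] = value
    def get(self, key: str):
        return self._data.get(key)

def handle_request(store: ObjectStore, session_id: str, prompt: str) -> str:
    """One agent turn: load session history, call the model, persist the result."""
    history = json.loads(store.get(session_id) or "[]")
    reply = call_model(prompt)
    history.append({"prompt": prompt, "reply": reply})
    store.put(session_id, json.dumps(history))
    return reply
```

In a real deployment the handler would run in a Worker, the store would be an R2 bucket binding, and the model call would be routed through AI Gateway for caching and observability.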

4

AI Code Assistance Lowers New Skill Mastery by 17%

A randomized trial found that AI assistance slightly sped up coding tasks (the difference was not statistically significant) but hindered skill mastery, especially debugging: participants using AI scored 17% lower on a quiz covering the new concepts. However, those who used AI for conceptual questions and comprehension retained more knowledge, suggesting the mode of interaction matters for learning.

ANALYSIS
1 story
1

Electrek: Tesla's Shift to AI is 'Automotive Suicide'

Tesla's Q4 2025 earnings call revealed a pivot from traditional car manufacturing to 'transportation as a service' and robotics, discontinuing Model S/X and new mass-market vehicles. Electrek argues this decision sacrifices an $80 billion automotive business for unproven ventures, calling it 'automotive suicide' despite declining revenue and deliveries.

TOOLS
1 story
1

6 Prompting Tips for Better LLM Output

Unimpressive LLM answers often stem from generic prompts. This article offers six strategies for ChatGPT, Claude, and Gemini, including assigning the model a role, providing detailed context, and asking it to 'think step by step'. Applied consistently, these methods yield sharper answers, turning the AI into something closer to a capable colleague than a search box.
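Several of these strategies compose naturally into a template. A minimal sketch, assuming a three-part structure (role, context, step-by-step instruction) rather than the article's exact wording:

```python
def build_prompt(role: str, context: str, task: str) -> str:
    """Compose a structured prompt: assign a role, supply detailed
    context, then ask for step-by-step reasoning on the task."""
    return (
        f"You are {role}.\n\n"
        f"Context:\n{context}\n\n"
        f"Task: {task}\n"
        "Think step by step before giving your final answer."
    )

prompt = build_prompt(
    role="a senior database engineer",
    context="A PostgreSQL query on a 50M-row table takes 40 seconds.",
    task="Suggest indexing strategies to speed it up.",
)
```

The same template works across ChatGPT, Claude, and Gemini; only the role and context need to change per task.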