PRISM News

#Anthropic

140 articles

Anthropic Just Closed a Door for Open Source Devs
Tech · EN

Anthropic is cutting off third-party tools like OpenClaw from Claude Code subscription limits — right as OpenClaw's creator joins OpenAI. Engineering constraint or competitive move?

The Machine That Teaches Itself: Are We Ready?
Culture · EN

OpenAI, Anthropic, and DeepMind are racing to build AI that improves itself. What happens when the pace of AI progress is set by AI—not humans?

Claude's Secret Feature: An AI That Works While You Sleep
Tech · EN

A surprise leak of Anthropic's Claude Code source code revealed 'Kairos'—a dormant background AI agent designed to act before you even ask. Here's what it means.

PRISM by Liabooks
Advertise with Us: [email protected]
Anthropic Accidentally Open-Sourced Its Secrets
Tech · EN

A routine update to Claude Code leaked over 512,000 lines of TypeScript source code, exposing internal AI instructions, unreleased features, and memory architecture. What does this mean for AI transparency?

When the Pentagon Picked a Culture War Over a Contract Dispute
Tech · EN

A California judge blocked the Pentagon from labeling Anthropic a supply chain risk. The 43-page ruling exposes a pattern: tweet first, lawyer later. What it means for AI governance and the limits of government leverage.

PRISM Weekly Digest: When Courts, Charts, and Chokepoints All Broke at Once
Digest · EN

A federal judge blocked the Pentagon's retaliation against Anthropic, BTS shattered a decade-old Billboard record, SK Hynix filed for a $10B+ US listing, OpenAI killed Sora, and three war fronts converged around the Strait of Hormuz.

Anthropic Fought the Pentagon — and Its Subscriber Count Won
Tech · EN

Claude's paid subscriptions more than doubled in early 2026 as Anthropic's DoD standoff and Super Bowl ads drove record consumer sign-ups. Here's what the data actually shows.

The Pentagon Blacklisted an AI Company for Talking to the Press
Tech · EN

A federal judge blocked the Pentagon's blacklisting of Anthropic, ruling that punishing a company for public criticism of government policy is a textbook First Amendment violation.

When an AI Company Sues the Pentagon
Tech · EN

Anthropic is fighting back after the Trump administration blacklisted it for limiting military use of its AI. The battle has reached Congress—and it's rewriting the rules of civil-military AI.

Claude Code's Auto Mode Wants to Be Your AI Safety Net
Tech · EN

Anthropic's new Auto Mode for Claude Code lets AI flag and block risky actions before they run. It's a clever fix for agentic AI's biggest problem — but who decides what "risky" means?

Claude Can Now Use Your Mouse. Should You Let It?
Tech · EN

Anthropic's Claude Code and Cowork can now directly control your Mac desktop—clicking, scrolling, and navigating files. As AI agents race to take over local computers, what are the real implications?

When an AI Company Told the Pentagon 'No'
Economy · EN

Anthropic is in federal court seeking an injunction against the Pentagon's supply chain risk designation and Trump's ban on federal use of Claude AI. Billions in contracts—and a bigger question about AI ethics—hang in the balance.
