Anthropic is gearing up for a significant legal showdown after the Pentagon officially classified the AI firm as a "supply chain risk." This marks the first instance of a US technology company receiving such a designation. The action comes in the wake of unsuccessful negotiations, public backlash from President Trump, and escalating concerns regarding the application of AI in national security.

Category

🗞 News
Transcript
00:00What just happened between Anthropic and the Pentagon has shocked the tech world,
00:04and it might reshape the future of AI in the U.S.
00:06The Pentagon has officially labeled Anthropic a supply chain risk.
00:11It is the first time in history that an American AI company has ever been given this designation.
00:16And Anthropic says the decision is not just unfair. It is illegal.
00:21Anthropic CEO Dario Amodei says they refused to give the military unrestricted access to their AI tools.
00:27They feared mass surveillance and autonomous weapons.
00:30And now the company believes it is being punished for it.
00:34Talks with the Pentagon were close to a deal, until a sudden blast from President Trump on social media.
00:39He ordered all federal agencies to stop using Anthropic, saying the government would never work with them again.
00:45Hours later, the Pentagon's decision became official,
00:48meaning any company working with the military could be blocked from doing business with Anthropic.
00:54Anthropic says they had no warning,
00:55and they are preparing to fight the designation in court.
00:59Meanwhile, Microsoft says it will still use Anthropic's tech, just not for defense projects.
01:04And millions of people continue signing up for Claude every single day.
01:08This battle is far from over.
01:10And the outcome could define how much control the U.S. government has over the future of AI.