US court rejects the Anthropic ban imposed by the Trump administration


A US federal court in San Francisco halted major actions taken by the Trump administration against Anthropic, granting the artificial intelligence company temporary relief. The ruling blocks a Pentagon designation that labeled the company a supply chain risk and suspends a directive that ordered federal agencies to stop using its artificial intelligence system, Claude.

Judge Rita Lin issued the preliminary order, stating that the government’s actions lacked legal support. The court found that the measures appeared punitive rather than grounded in national security concerns.

Anthropic obtains court order against Trump administration actions

The court order prevents enforcement of restrictions imposed by the Trump administration. These restrictions follow the Pentagon’s decision to classify Anthropic as a supply chain risk, along with a directive requiring federal agencies to stop using the company’s chatbot.

Judge Lin stated that no law supports designating a US company as an adversary on the basis of disagreement with US government policy. She described the actions as “arbitrary” and an abuse of discretion. In addition, the ruling highlighted that such measures could harm the company’s position in the enterprise AI market.

According to data from Menlo Ventures, Anthropic holds a 32% share of the enterprise AI sector in 2025, ahead of competitors including OpenAI at 25%. The court noted that a government-wide ban could erode this position.

The injunction follows a lawsuit filed by Anthropic on March 9 in Washington, D.C., in which the company alleged that Defense Secretary Pete Hegseth exceeded his authority. Specifically, the lawsuit challenged the national security designation and related restrictions.

Contract disputes and policy differences lead to conflict

The dispute dates back to an agreement between Anthropic and the Pentagon in July 2025. The goal of the contract was to make Claude the first frontier AI model certified for classified networks. However, negotiations collapsed in February after the Pentagon sought to modify the terms.

The Department of Defense requested unrestricted military use of Claude for all lawful purposes. Anthropic rejected these conditions, maintaining that its technology should not support lethal autonomous weapons or mass domestic surveillance.

After the collapse, the administration escalated its measures against the company. On February 27, President Trump ordered federal agencies to stop using Anthropic products. The Pentagon also moved to classify the company as a supply chain risk.

During a March 24 hearing in San Francisco, Judge Lin questioned government lawyers about their motives. She asked whether Anthropic was facing sanctions for its public criticism of the Pentagon’s position. The court later concluded that the actions were akin to First Amendment retaliation.

Additional developments as the United States weighs de-escalation of the war with Iran

The court’s decision comes as the Trump administration evaluates its broader foreign policy stance. As CoinGape reported, the United States is preparing for peace talks in the Iran war as President Trump considers scaling back military efforts. Officials are now exploring negotiations as the conflict enters its fourth week and affects global markets, including oil prices.

According to Axios, intermediaries such as Egypt, Qatar, and the United Kingdom exchanged messages between the United States and Iran. These contacts reveal that Iran is open to negotiations, although the conditions remain strict.

Iran has reportedly requested a ceasefire, guarantees against renewed conflict, and compensation. In return, the United States seeks several commitments, including a temporary pause on missile development, a complete halt to uranium enrichment, and restrictions on funding for the agency.


