Latest news, Wikipedia summary, and trend analysis.
This topic has appeared in the trending rankings once in the past year. While it does not trend frequently, its appearance suggests a renewed or concentrated surge of public interest.
Based on Wikipedia pageviews and search interest, this topic gained significant attention on the selected date.
This topic is not currently in the ranking.
The Safe and Secure Innovation for Frontier Artificial Intelligence Models Act, or SB 1047, was a failed 2024 California bill intended to "mitigate the risk of catastrophic harms from AI models so advanced that they are not yet known to exist". Specifically, the bill would have applied to models which cost more than $100 million to train and were trained using a quantity of computing power greater than 10^26 integer or floating-point operations. SB 1047 would have applied to all AI companies doing business in California—the location of the company would not matter. The bill would have created protections for whistleblowers and required developers to perform risk assessments of their models prior to release, with guidance from the Government Operations Agency. It would also have established CalCompute, a University of California public cloud computing cluster for startups, researchers and community groups.
This topic has recently gained attention due to increased public interest. Search activity and Wikipedia pageviews suggest growing global engagement.
Search interest data over the past 12 months indicates that this topic periodically attracts global attention. Sudden spikes often correlate with major news events, public statements, or geopolitical developments.