By Tom Still

MADISON, Wis. – Nvidia became the world’s most valuable public company for a few days this month – yes, more valuable than even Microsoft and Apple – for one major reason: The boom in generative artificial intelligence has produced a surge in demand for Nvidia’s “graphics processing units,” the chips that make it possible to train and run entire AI systems.

Two years ago, the markets valued Nvidia at $400 billion. On June 18, it surpassed $3 trillion, a sign those markets know AI is here to stay and likely over time to spur productivity gains in industries that range from manufacturing to agriculture, from fast food to marketing, and from telecommunications to security.

What goes up in financial markets often comes down, of course, which can happen in coming years if market forays into finding uses for AI don’t pan out as dramatically as expected. Predicted uses are so broad, however, that chances for widespread adoption of AI over the next five years have even skeptics wondering how far it might go.

None of that is stopping many states from pondering and even passing legislation which – if done in haste or a sense of panic – could crimp innovation in the name of public protection.

Lawmakers in Colorado passed the Colorado AI Act in mid-May, which will impose requirements on developers and deployers of “high-risk” AI systems in areas such as employment, housing, financial services and health care. The law carries disclosure, risk-management and consumer-protection obligations for both developers and deployers. It is set to take effect in early 2026.

California, usually quick to regulate just about anything, has about 50 AI bills under consideration. A major issue there, as well as in Tennessee, is ensuring that actors and musicians don’t have their likenesses and voices used in AI-generated content without their consent.

Connecticut’s state Senate passed an AI bill in April to rein in bias in artificial intelligence decision-making and protect people from possible harm. It has stalled in the state House of Representatives over concerns it would stifle innovation, become a burden for small businesses and make the state an outlier.

Laws to curb deceptive media, especially around elections, have cleared legislatures in California, Texas, Minnesota and Washington; similar laws are pending in New York, New Jersey and Michigan. Maryland requires employers to get employee consent before using AI systems that collect data about them. Georgia, Maine, Maryland, Massachusetts, Pennsylvania, South Carolina, Vermont and others have bills pending to curb AI “profiling” of people.

In all, about 400 bills have been introduced or enacted around the country, a regulatory pace that seems to outstrip the development of the industry itself.

Wisconsin may be among those states that follow a more deliberate path. Delaware plans to create a bipartisan commission with both public and private members to recommend regulatory paths, and Wisconsin appears to be doing much the same. Through the Legislative Council, a non-partisan service arm of the Wisconsin Legislature since 1947, a study committee has been established.

Its charge is to “review current uses of artificial intelligence technology” and make recommendations regarding use and development of such technology. That may include reviewing “the use of artificial intelligence in disinformation and artificial imagery” and other “high-risk” uses. High risk has yet to be defined.

There is no doubt AI is a tool that can be used for good or evil, much like some other technology tools. Because federal regulation isn’t on the horizon, states are trying to fill the void. The danger in a state-by-state approach is a hodge-podge of laws that may leave people and businesses in some states at a competitive disadvantage.

Nvidia and other innovators have proven the market has embraced AI, a tech-blue genie that cannot be crammed back into the bottle. The trick will be adopting regulations that don’t shut down legitimate needs and opportunities while keeping deepfakes, disinformation and privacy breaches at a minimum.

Still is president of the Wisconsin Technology Council. He can be reached at news@wisconsintechnologycouncil.com.