This article was originally published on VOA News - Technology.
ChatGPT developer OpenAI announced Tuesday it is establishing a safety committee as it trains the next version of the artificial intelligence model behind its chatbot, the GPT-4 system.
The committee will include OpenAI CEO Sam Altman, board members and other executives. The company said the body will spend the next 90 days strengthening OpenAI's processes and safeguards for advanced AI development to protect against potential misuse and exploitation.
The committee will make recommendations to the full board on “critical safety and security decisions for OpenAI projects and operations,” the company said in a statement.
The announcement comes weeks after key executives departed the company.
Researcher Jan Leike, who resigned from OpenAI earlier this month, said the company’s “safety culture and processes have taken a back seat to shiny products.”
Ilya Sutskever, OpenAI co-founder and chief scientist, also resigned. “I’m confident that OpenAI will build AGI [artificial general intelligence] that is both safe and beneficial,” he said on the social media site X, formerly Twitter.
Leike and Sutskever jointly led the company's "superalignment" team, which was dedicated to reducing long-term AI risks and was disbanded after their departures.
OpenAI has faced backlash over allegations that a voice for ChatGPT copied that of actress Scarlett Johansson. The company denied trying to impersonate Johansson.
The new committee will publicly release its recommendations for the company following its meeting with the full board in the fall.
"We welcome a robust debate at this important juncture,” OpenAI’s statement said.
Some information for this report was provided by The Associated Press and Agence France-Presse.