China Prepares UN Resolution To Tap AI For Good


China is preparing a United Nations General Assembly resolution that it says is intended to help close gaps between rich and developing countries in the advance of artificial intelligence, an initiative that follows an extensive and ambitious campaign by the US, its biggest AI competitor.

“The rapid development of AI technology has not fully benefited the vast majority of developing countries,” Tao Wang, a spokeswoman for the Chinese mission to the UN, said Thursday.

The nonbinding resolution would focus on capacity-building to help developing countries fully benefit from AI, as well as to bridge digital divides among countries of varying resources so that “no one will be left behind,” Wang added, without providing further details.

The Chinese effort follows the General Assembly’s adoption of a US-led resolution that encourages countries to support “responsible and inclusive” AI development through domestic regulations and governance, an initiative that garnered the support of more than 110 countries — including China, which joined the list of co-sponsors at the last minute.

Washington and Beijing are engaged in an extensive AI and semiconductor race for dominance of the technology field. President Joe Biden has kicked off a US$11 billion push to bolster American leadership in chips research and development while curbing Beijing’s access to leading-edge technology. China, for its part, is working hard to narrow a gap in AI investment with the US.

Wang said the resolution would complement initiatives by other countries and that China will work with member states to reach consensus as soon as possible. – Bloomberg

