“Algorithm Queen”: robot artist Ai-Da creates a portrait of Elizabeth II to celebrate the Platinum Jubilee

A robot artist has created a disturbing new portrait of Queen Elizabeth II to celebrate the monarch’s Platinum Jubilee.

Entitled “Algorithm Queen”, the portrait was painted by Ai-Da, the humanoid robot artist created by gallery director Aidan Meller in 2019.

Ai-Da uses the cameras in her eyes and computer algorithms to process human features and transform what she “sees” into coordinates.

She then uses these coordinates to calculate a virtual path for her robotic arm as she draws and paints on canvas to create artwork.
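
To make that camera-to-canvas pipeline concrete, here is a rough, hypothetical Python sketch: a frame is reduced to feature points, the points are scaled onto the canvas, and a simple nearest-neighbour walk orders them into a path a robotic arm could follow. The function names, the use of OpenCV’s corner detector and the path-ordering step are illustrative assumptions, not details of Ai-Da’s actual software.

# Hypothetical camera-to-canvas sketch; it does not describe Ai-Da's real implementation.
import cv2
import numpy as np

def detect_features(frame):
    """Reduce a camera frame to (x, y) feature points. A generic corner
    detector stands in for whatever feature extraction the real system uses."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=200, qualityLevel=0.01, minDistance=5)
    return corners.reshape(-1, 2) if corners is not None else np.empty((0, 2))

def to_canvas(points, frame_size, canvas_size):
    """Scale image-space points into canvas-space coordinates."""
    (fw, fh), (cw, ch) = frame_size, canvas_size
    return points * np.array([cw / fw, ch / fh])

def plan_path(points):
    """Order the points into a rough drawing path via a nearest-neighbour walk."""
    path, remaining = [points[0]], list(points[1:])
    while remaining:
        nearest = min(remaining, key=lambda p: float(np.hypot(*(p - path[-1]))))
        path.append(nearest)
        remaining = [p for p in remaining if p is not nearest]
    return np.array(path)

# A synthetic "sitter" stands in for a live camera frame.
frame = np.full((480, 640, 3), 255, dtype=np.uint8)
cv2.circle(frame, (320, 240), 100, (0, 0, 0), 2)        # face outline
cv2.line(frame, (280, 210), (305, 210), (0, 0, 0), 2)   # left eye
cv2.line(frame, (335, 210), (360, 210), (0, 0, 0), 2)   # right eye
cv2.line(frame, (295, 280), (345, 280), (0, 0, 0), 2)   # mouth

pts = detect_features(frame)
coords = to_canvas(pts, frame_size=(640, 480), canvas_size=(600, 800))
path = plan_path(coords)          # waypoints a robotic arm could trace
print(f"{len(path)} waypoints, first: {path[0]}, last: {path[-1]}")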

This is the first time in history a humanoid robot has painted a member of the royal family.

“I would like to thank Her Majesty the Queen for her dedication and service to so many people,” said Ai-Da, who is also able to converse using a specially designed AI language model.

“She is an exceptional woman, courageous, absolutely committed to public service.

“I think she is an extraordinary human being and I wish the Queen a happy Platinum Jubilee.”

Entitled “Algorithm Queen”, the portrait was painted by “Ai-Da”, the humanoid robot artist created by gallery director Aidan Meller in 2019 (pictured)

It is the first time in history that a humanoid robot has painted a member of the royal family

Robot artist Ai-Da spends 10 days in jail because border agents fear she is a SPY

British-made robot Ai-Da spent 10 days in Egyptian customs detention in October 2021 after agents feared her robotics might be a covert spying tool.

Creator Aidan Meller said Ai-Da was initially detained by guards who were suspicious of her modem, the device that connects her to the Internet.

He offered to remove it, but the guards then raised issues with the cameras mounted in her eyes, which are essential to her ability to paint.

Ultimately, Ai-Da was released just hours before the “Forever is Now” exhibition in Cairo was due to open.

By the time of the Queen’s coronation in 1953, the first printed circuit computers had just been invented, a design that remained mainstream until the 1960s.

Over the course of her 70-year reign, the Queen has witnessed an explosion of innovation in computer technology in the UK, including the birth of machine learning and artificial intelligence.

“We are thrilled that Ai-Da Robot made history just in time for the Queen’s Jubilee,” said creator and project director Aidan Meller.

“The Queen has been a stable and strong leader in a period of extraordinary change and development in history.

“We are in unprecedented technological times and so we are delighted to be able to take a moment to think about everything that has changed during the Queen’s life.

“Ai-Da Robot’s ‘Algorithm Queen’ gives us an indication of how far things have come in her lifetime and a great way to recognize her loyal service.”

“Algorithm Queen” will go on public display in London later this year and will be unveiled on the Ai-Da Robot website at 10 on Friday 27 May.

Ai-Da, named after the 19th-century mathematician Ada Lovelace, is the world’s first ultra-realistic robot capable of drawing people from life, according to her creators.

Last year, she exhibited a series of “self-portraits” at the Design Museum in London, which she created by “looking” into a mirror with her camera eyes.

Last year, the humanoid robot Ai-Da exhibited a series of “self-portraits” which she created by “looking” into a mirror with her camera eyes.

AI processes and algorithms transform what it sees into coordinates. The robotic hand used by Ai-Da – which was developed by engineers in Leeds – then calculates a virtual path, interpreting the coordinates to create a work of art (pictured)

She also held a solo exhibition, entitled ‘Leaping into the Metaverse’, at the 59th International Art Exhibition, and participated in Forever is Now 2021, the first major contemporary art exhibition at the Great Pyramids of Giza in Egypt.

“The greatest artists in history have grappled with the times they lived in, and both celebrated and questioned the changes in society,” Meller said.

‘Ai-Da Robot, as a technology, is the perfect artist today to discuss the current obsession with technology and its unfolding legacy.

‘Is so-called “progress” in technology something we really want and, if so, how should it manifest itself?’

‘The game is over!’ Google’s DeepMind says it is close to achieving “human-level” artificial intelligence.

DeepMind, a British company owned by Google, may be on the verge of achieving human-level artificial intelligence (AI).

Nando de Freitas, a researcher at DeepMind and professor of machine learning at Oxford University, said “the game is over” when it comes to solving the toughest challenges in the race to achieve artificial general intelligence (AGI).

AGI refers to a machine or program that has the ability to understand or learn any intellectual task a human being can do, and does so without training.

According to de Freitas, researchers are now focused on scaling up artificial intelligence programs, for example with more data and computing power, to create an AGI.

DeepMind previously unveiled a new artificial intelligence “agent” called Gato that can complete 604 different tasks “in a wide range of environments”.

Gato uses a single neural network, a computer system with interconnected nodes that function like nerve cells in the human brain.

It can chat, caption images, stack blocks with a real robotic arm and even play games from the 1980s Atari home video game console, says DeepMind.
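
As a loose illustration of what “interconnected nodes” means, the toy Python network below passes an input through two layers of weighted connections. It is purely a teaching sketch under simple assumptions (random weights, a tanh activation) and says nothing about how Gato itself is built.

# Toy neural network: layers of "nodes" linked by weights, loosely analogous
# to nerve cells. Illustration only; unrelated to DeepMind's actual Gato model.
import numpy as np

rng = np.random.default_rng(0)

# Weights connecting 4 input nodes -> 8 hidden nodes -> 2 output nodes.
w1 = rng.normal(size=(4, 8))
w2 = rng.normal(size=(8, 2))

def forward(x):
    """Each node sums its weighted inputs and applies an activation,
    loosely analogous to a nerve cell firing."""
    hidden = np.tanh(x @ w1)   # hidden nodes respond to the input
    return hidden @ w2         # output nodes combine the hidden signals

print(forward(np.array([0.1, 0.5, -0.3, 0.7])))  # two output values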