Paintings come to life [message #96010]
Thu, 22 September 2022 09:26
Rusty
Messages: 1205 Registered: May 2018 Location: Kansas City Missouri
Illuminati (3rd Degree)
I saw this the other day on YouTube. It's really kind of touching to see these A.I. renderings of the subjects of old paintings come to life as real people, unromanticized, smiling and blinking their eyes at us, with the Mona Lisa by Leonardo da Vinci as the crowning piece. An amazing testament to what the technology can provide.
https://www.youtube.com/watch?v=bXCYBsG6ork
Abe Lincoln given the same treatment. Even made to look contemporary.
https://www.youtube.com/watch?v=LgbwNZ-piZs
Re: Paintings come to life [message #96011 is a reply to message #96010]
Thu, 22 September 2022 11:44
Wayne Parham
Messages: 18793 Registered: January 2001
Illuminati (33rd Degree)
That's so cool, Rusty! Thanks for posting that here!
I remember the first computer-generated photo-realistic images I saw. They were done by ray-tracing software, and it wasn't that long ago, really, back in the 1990s.
Just before then - in the 1980s - there were what I would call close-to-realism renderings, mostly done with techniques that could shade and highlight fairly primitive 3D models. I'm talking about images like you saw in the original TRON movie.
By the 1990s, you could render something that looked genuinely realistic with ray-tracing software. A ray tracer works from 3D models, a virtual camera and light sources. The models describe not only the shape of an object but also its surface characteristics, like texture, color and reflectivity.
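Just to illustrate the idea, here's a tiny sketch in Python of the core of the technique. It's not any particular renderer's code, and the whole scene (one sphere, one light, a camera at the origin) is made up for the example, but it shows the basic loop: shoot a ray from the virtual camera through each pixel, test it against the model, and shade the hit point based on the light.

import math

# Minimal ray-tracing sketch (illustration only, not any real renderer's code):
# one sphere, one point light direction, a virtual camera at the origin, and
# simple Lambert shading, drawn as ASCII art.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def hit_sphere(origin, direction, center, radius):
    """Distance along the ray to the nearest intersection, or None for a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * dot(direction, oc)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c                 # direction is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

WIDTH, HEIGHT = 60, 30
camera = (0.0, 0.0, 0.0)
center, radius = (0.0, 0.0, -3.0), 1.0     # the whole "3D model" in this toy scene
light = normalize((1.0, 1.0, 0.5))         # direction toward the light
shades = " .:-=+*#%@"                      # darkest to brightest

for y in range(HEIGHT):
    line = ""
    for x in range(WIDTH):
        # Shoot a ray from the camera through this pixel of the image plane.
        px = 2.0 * (x + 0.5) / WIDTH - 1.0
        py = 1.0 - 2.0 * (y + 0.5) / HEIGHT
        direction = normalize((px, py, -1.0))
        t = hit_sphere(camera, direction, center, radius)
        if t is None:
            line += " "                    # ray missed the model: background
        else:
            hit = tuple(camera[i] + t * direction[i] for i in range(3))
            normal = normalize(tuple(hit[i] - center[i] for i in range(3)))
            brightness = max(0.0, dot(normal, light))   # Lambert's cosine law
            line += shades[int(brightness * (len(shades) - 1))]
    print(line)

Run it and you get a crude ASCII-shaded ball. A real renderer does the same thing with millions of rays, much richer surface descriptions, reflection and refraction bounces, and so on, but the camera/model/light structure is the same.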
Movement modeling was also going through research and improvements in the 90s. The earliest animations tracked the position of each object separately, so every frame had the X/Y/Z coordinates of every object, and you manipulated each one on its own. Later improvements let you "attach" parts to one another with simulated "joints" that only allowed certain degrees of freedom. That made realistic motion modeling much easier, especially when a physical subject was filmed, its motions captured, and the result transferred into the computer model.
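The "joints" idea is easy to sketch too. Here's a toy forward-kinematics example in Python (again, something I made up for illustration, not any animation package's actual API): a 2D arm where the forearm is attached to the upper arm by a hinge joint with a limited range, so rotating the shoulder carries the forearm along with it instead of every part needing its own independent coordinates each frame.

import math

# Toy forward-kinematics sketch (made up for illustration): two segments
# connected by hinge joints, each with one rotational degree of freedom.

class Joint:
    def __init__(self, length, min_angle, max_angle, parent=None):
        self.length = length          # length of the segment this joint drives
        self.min_angle = min_angle    # allowed range = the joint's freedom
        self.max_angle = max_angle
        self.parent = parent
        self.angle = 0.0              # local rotation relative to the parent

    def set_angle(self, angle):
        # Clamp to the joint's allowed range (its single degree of freedom).
        self.angle = max(self.min_angle, min(self.max_angle, angle))

    def world_angle(self):
        parent_angle = self.parent.world_angle() if self.parent else 0.0
        return parent_angle + self.angle

    def end_point(self):
        # Start where the parent segment ends, then extend along our direction.
        start = self.parent.end_point() if self.parent else (0.0, 0.0)
        a = self.world_angle()
        return (start[0] + self.length * math.cos(a),
                start[1] + self.length * math.sin(a))

shoulder = Joint(length=0.3, min_angle=-math.pi, max_angle=math.pi)
elbow = Joint(length=0.25, min_angle=0.0, max_angle=2.5, parent=shoulder)

# Animate: rotate only the shoulder, and the forearm follows automatically.
for frame in range(5):
    shoulder.set_angle(frame * 0.2)
    elbow.set_angle(1.0)
    print(f"frame {frame}: hand at {elbow.end_point()}")

Roughly speaking, motion capture feeds that same kind of structure: you record joint angles from a filmed subject frame by frame and play them back through the hierarchy.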
In the 2000s, I started to see photo-realistic computer models of people. That took things to a whole new level. The second TRON movie, TRON: Legacy, is an example: the rendering of the young Jeff Bridges was outstanding. Now you sometimes almost can't tell whether the people in a movie are actual humans or CGI.
So this project - the subject of your thread - is very interesting to me. Taking 2D images of paintings of famous historical figures and using them to build 3D models brings those people back to life! We can come much closer to really seeing what they looked like.
Re: Paintings come to life [message #96012 is a reply to message #96011]
Thu, 22 September 2022 12:13
Rusty
Messages: 1205 Registered: May 2018 Location: Kansas City Missouri
Illuminati (3rd Degree)
I remember training on the CT scanner at work. We had some volunteers from the hospital staff who let us scan them as part of our protocol indoctrination. A maintenance man had us do his facial bones. He had a deviated septum. But when we ran the various imaging algorithms showing his skull and facial bones, it was evident who it was. It was Fred the maintenance man. His facial bones looked just like him. But he found it a little unnerving to see his damn skull staring back at him beneath the full-flesh mug he usually saw in the mirror. He didn't want a copy of the scan. It made me appreciate this technology, though, and how sci-fi advanced it seems.
My nephew did an AI rendering of his great-grandfather, my grandfather, from a photograph. It showed him turning his head slightly and blinking, like in the painting renderings. It brought him back to life in its own strange and wonderful way.