These discussions probably should involve more philosophy, and maybe even theology, than technical talk. I often refer to AI as alien, as the way it processes and computes data is so foreign to what humans are capable of understanding that we will likely lose control of it at some point. The marketing teams for these initiatives have come up with a plethora of different methods to try to get the masses to accept all of this, mainly in the form of convenience. I find a lot of normalcy bias in these discussions, where people do not acknowledge the damage previous computing technology has done to society and do not question whether this current technology will be worse.
Some believe this is a stepping stone to utopia. I don't believe man was ever meant to create utopia. We have attempted it throughout thousands of years of history, with some of the greatest and brightest empires failing to achieve the impossible pipe dream of what most would consider utopia. I do not believe a philosophically and morally bankrupt society living on the edge of frivolity and apathy, one that just so happens to have an abundance of computing power, will achieve this utopia any better than previous societies did. In all honesty, I believe our society now is probably one of the dumbest in history, on both the mundane and intellectual levels.
Some futurists suggest that AI may eliminate the need to work. Setting aside the massive neo-feudalistic implications of that, I would ask: what if humanity was never meant to be idle? We'd all like to think that it would allow us to do our own thing, playing and studying without a care in the world. I think the summer of 2020 is closer to what the reality would look like. Some people need to be busy with work, or else they will be burning down society. Some simply need to work, or else they will go crazy, like many people I know who left retirement and went back to work for non-financial reasons. People would likely find less fulfillment in their lives, becoming depressed and forced even further into apathy.
I don't pretend to have all the answers, but AI is showing the potential to usher in a radical change for humanity. Other than some discussions at universities or on gun forums, we as a society really aren't having the serious philosophical discussions about what will happen. I believe there is a real likelihood humanity will be much worse off as a result, yet this doesn't stop all these researchers from surging forward, dragging us into this Brave New World whether we like it or not. Soon we'll be in a digital Dark Forest, where we will have to prove ourselves to be human even in discussions like this. And on the subject of solutions to the Fermi Paradox: what if the uncontrolled creation of AI is the Great Filter?