Comments

Humanity has historically tended toward believing that technology will eventually "save" us from whatever horrors reality presents and whatever awful futures seem inevitable. That has long been the default view, but increasingly technology is seen as a threat to humanity, a crutch, or something altogether evil. If we take the position that technology will eventually kill us, then we might as well use the resources we have left to go on one last gasser, right? And if it will save us, we may as well spend our remaining resources on a quest for that saving tech, the final piece, right? Or maybe there's a way we can live with it, and allow it to live with us, with AI as the culmination of technology: we wouldn't even have to think anymore, just say or gesture what we'd like and the machines would make it so. That, however, sounds quite dull.

I think we need democratically established values to guide the development and implementation of technology, not the power- and profit-driven whims of governments and corporations. I suppose, as it stands now, tech won't kill us, because if it did, we couldn't buy new tech or vote.