How close are we to a singularity? Well, in terms of raw processing speed, computers overtook us back in the 1990s (neurons fire at roughly 200 Hz at most), and they are now many orders of magnitude faster.
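To put rough numbers on that gap, here's a back-of-the-envelope sketch. The ~3 GHz clock is my assumption for a typical modern CPU, not a figure from any source:

```python
import math

neuron_hz = 200   # rough upper bound on neuron firing rate
cpu_hz = 3e9      # assumed clock speed of a typical modern CPU (~3 GHz)

ratio = cpu_hz / neuron_hz
print(f"speed ratio: {ratio:.1e}")                       # 1.5e+07
print(f"orders of magnitude: {math.log10(ratio):.1f}")   # 7.2
```

So even a single consumer CPU core cycles about seven orders of magnitude faster than a neuron fires, though of course a clock cycle and a neuron firing are not doing comparable work.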
"But wait!" you reasonably object, "that's only one aspect of computer power! What about hard drive storage, and how does working memory compare to RAM?"
I'm not sure how much memory the human brain can hold. Scientific American says it can hold 2.5 petabytes, and this figure seems to be the most heavily cited in pop science articles, but the magazine doesn't really explain its numbers. AI Impacts, which is much more mathematically rigorous and usually shows its work, claims that "Most computational neuroscientists tend to estimate human storage capacity somewhere between 10 terabytes and 100 terabytes" (at most about a tenth of a petabyte). Humans are very unlikely to run the most efficient possible algorithm, of course, and it's possible that 1 terabyte is more than enough for a perfectly optimized general intelligence. Still, let's assume for the sake of argument that the highest estimate, 2.5 petabytes, is correct and is the absolute minimum needed for a human-level or higher intelligence. How does that compare to the current state of computers?
The internet has been estimated to contain hundreds of exabytes of data... as of 2007! A 2009 article said that Google alone had about an exabyte of data, and in 2013 Randall Munroe (of xkcd) estimated that Google had 10 exabytes. Estimates for the whole internet in 2020 run to double-digit zettabytes. Each exabyte is a thousand petabytes, and each zettabyte is a thousand exabytes. Most of this storage is occupied and in use for other purposes, but if any big company knew how to program a superintelligence, it could clearly afford to build a "mere" 2.5-petabyte server building. Some botnets probably already control this much hardware. The memory would need to be distributed in some way, but methods for that already exist and are used for things like archives and YouTube. Computers clearly have enough physical memory for a superhuman intelligence.
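As a sanity check on those unit conversions: the sketch below uses the 2.5-petabyte brain estimate and Munroe's 10-exabyte Google estimate from above, plus an assumed 64 zettabytes for the 2020 internet (my pick from the "double digits of zettabytes" range, not a sourced figure):

```python
PB = 10**15   # petabyte, in bytes
EB = 10**18   # exabyte  = 1,000 petabytes
ZB = 10**21   # zettabyte = 1,000 exabytes

brain = 2.5 * PB           # highest estimate of human brain capacity
google_2013 = 10 * EB      # Munroe's 2013 estimate for Google
internet_2020 = 64 * ZB    # assumed mid-range 2020 internet estimate

print(google_2013 / brain)    # 4000.0 -- "brains" of storage at Google alone
print(internet_2020 / brain)  # 25600000.0 -- across the whole internet
```

Thousands of brain-equivalents of storage at one company, and tens of millions across the internet, is a lot of headroom above the highest brain estimate.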
What about RAM? This is so different from human working memory that a direct comparison can't really be made. Humans can analyze a very complicated ongoing event based on vision and sound in real time, but (usually) can't multiply two floating point numbers in their mind. One thing that can be used for comparison, though, is self-driving cars.
There are already hundreds of self-driving cars and taxis on the road with a good enough safety record to keep operating in Arizona. There's an occasional collision, such as the high-profile fatal strike of a pedestrian that was publicized to the point where Uber paused its autonomous testing for a while, and which seems to come up every time an article criticizes the safety of self-driving cars. That death was tragic, of course, but we need a fair comparison: if a human driver hit someone crossing the street outside a crosswalk at night, it wouldn't make the news, and the driver might not even be found at fault. There have been other incidents where autonomous cars were caught breaking the law, but compared to the rate at which human drivers break the law, it's not so bad; in fact, some reports say they are already safer than humans. Based on this, I'd say that given a good algorithm, computer RAM can function at least as well as human working memory on the spatial tasks humans are good at, and it is much better than working memory at anything involving computation.
Here's a summary of the situation right now. Clock speed is orders of magnitude better than humans'. Storage is at least equal to and probably a couple of orders of magnitude greater than humans' (with 5-6 more orders of magnitude available to an AI that takes over the internet). RAM is about as good at some things, and far better at others, than the human equivalent. If connected to the internet, an AI would have access to far more information than a human does, especially once it figures out how to hack cameras. And computer power is still improving each year. The only reason the singularity hasn't happened yet is that humans don't understand intelligence well enough to create even a human-level one. In a physics metaphor: a 10,000 kg pile of enriched uranium can fission at any time, but it won't until the first free neutron appears and starts the chain reaction.
Recently there was a post called Fun with +12 OOMs of Compute, where people imagine what could happen if computation power, memory, and the like were suddenly multiplied by 10^12. A poll near the end asked people to estimate the chance of the singularity happening by the end of the year under that assumption, based on their own mental model of reality without deferring to peer beliefs. Most answers were around 95%, reflecting a belief that computation power is the main barrier to hyperexponential growth. The problem with those models, as I see it, is that we already have +several OOMs of computation more than needed. If computer power were the only thing standing between us and the singularity, then we finally got enough computer power... a decade ago.
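To make the "+several OOMs" claim concrete, here is a rough sketch using the figures from earlier in this post (the ~3 GHz CPU clock is again my assumption, and these are loose upper-bound comparisons, not careful estimates):

```python
import math

# Storage headroom: Google's 2013 estimate (10 EB) vs the highest
# brain-capacity estimate used above (2.5 PB).
storage_ooms = math.log10((10 * 10**18) / (2.5 * 10**15))

# Clock-speed headroom: an assumed ~3 GHz CPU vs ~200 Hz neurons.
speed_ooms = math.log10(3e9 / 200)

print(f"storage headroom: {storage_ooms:.1f} OOMs")  # ~3.6
print(f"speed headroom:   {speed_ooms:.1f} OOMs")    # ~7.2
```

On these numbers we already sit several OOMs above the brain on storage and speed, which is exactly why I don't think raw compute is the binding constraint.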
I know that realizing the deadline for AI alignment is "now" can be stressful, but there is a bit of good news here. If a good algorithm is the remaining challenge, it is very unlikely to be found except by people who really know what they're doing. We are probably safe from people trying brute-force methods or evolving big, probably-unaligned neural nets.