Metaculus keeps switching itself back to light mode every so often, which I'm not a fan of. Could this be fixed? Or just make dark mode the default?

@WilliamKiely I probably have Covid (albeit nearly asymptomatic) as I've been isolating with someone who got a positive test result a few days ago.

Don't have a date yet, but I'll report back once I get a test done.

A YouTuber by the name of [BarelySociable](https://www.youtube.com/channel/UC9PIn6-XuRKZ5HmYeu46AIw/videos) claims that Satoshi's real identity is [Adam Back](https://en.wikipedia.org/wiki/Adam_Back) in a three-part, roughly 90-minute documentary. It doesn't provide proof beyond reasonable doubt, but it *does* make a pretty solid case for Back in the last part. Overall it's quite thorough and well-researched. [Part 1](https://www.youtube.com/watch?v=_Kav2K1DVWo) - [Part 2](https://www.youtube.com/watch?v=fMWnaR5uJxQ) - [Part 3](https://www.youtube...

What if North Korea merges peacefully with the South or otherwise ceases to be an independent country?

Feels like it might be a bit unfair to give an ambiguous resolution if the land that used to be North Korean becomes part of a larger democratic state.

@nagolinc

I agree with the "much more livable" part, but physical distance doesn't particularly matter here - travel time does.

In that regard, getting to Mars takes 6-8 months (assuming fuel-efficient transfer orbits, with launch windows opening only once every ~2 years), while reaching America from Europe by ship would've taken ~4-6 weeks. Still a fair bit slower, but not 40,000 times slower.
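Rough arithmetic behind that ratio, using midpoints of the ranges above (exact durations vary by launch window and ship, so this is just a sanity check):

```python
# Travel-time comparison: Mars transit vs. an Atlantic crossing.
mars_transit_days = 7 * 30      # ~7 months (midpoint of 6-8), ~30 days/month
atlantic_crossing_days = 5 * 7  # ~5 weeks (midpoint of 4-6)

ratio = mars_transit_days / atlantic_crossing_days
print(f"Mars is ~{ratio:.0f}x slower to reach")  # ~6x, nowhere near 40,000x
```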

  • When will a quantum computer running Shor's algorithm (or a similar one) be used to factor one of the RSA numbers for the first time?
  • When will all of the RSA numbers be factorized?

What if OpenAI releases a bigger NLP model but doesn't explicitly call it GPT-4? For example, maybe they come out with "GPT-3.5" or "GPT-X" or "Really Big Language Model 1000".

What if it's primarily, but not exclusively, an NLP model? E.g. capable of working with text and images in the same neural net, but still overall "better" with text-only input than GPT-3.

Coronavirus: Air pollution and CO2 fall rapidly as virus spreads

Levels of air pollutants and warming gases over some cities and regions are showing significant drops as coronavirus impacts work and travel.

There are several options worth taking into consideration:

  • **Venus.** Possibly the best candidate to have harbored life in the past, but finding any evidence of it today seems unlikely, even with improved technology. There is some possibility of thermoacidophilic [extremophiles](https://www.liebertpub.com/doi/full/10.1089/ast.2017.1783) living in the upper atmosphere, roughly 50 to 65 km above the surface.
  • **Mars.** Liquid water is widely thought to have existed on Mars in the past, and now can occasionally be found as [low-volume liquid brines](h...

The question resolves negative. It's May now and the highest rated model is Google's T5 at 89.3%.

Shame it got so close, but I guess that's how it goes. It seems the resolution date was set too far ahead as well, so people might get their points truncated if that's not adjusted.

@Jgalt Is it really a World War if it doesn't start on Earth?

I wonder what's the most plausible scenario for this to resolve positively? Here's one attempt:

1. A dozen or so humans are sent to Mars as part of our first crewed mission to the red planet.
2. Soon after landing, some entity back on Earth creates the world's first AGI. It turns out to be misaligned and capable of fast takeoff, meaning it starts turning the entire planet into some equivalent of paperclips in a matter of months or less.
3. The Mars colonists, now the last survivors of humanity until the AI comes for them too, watch in sheer horror. Some...

@fifferfefferfef

The lower bound is ~5000 already, so there's no reason to go down to 0. I just rounded up to the nearest order of magnitude. People can still put most of their prediction mass below 10k if they choose to do so.
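For concreteness, "rounding up to the nearest order of magnitude" here just means taking the next power of ten above the lower bound:

```python
import math

# Round the ~5000 lower bound up to the nearest order of magnitude.
lower_bound = 5000
upper = 10 ** math.ceil(math.log10(lower_bound))
print(upper)  # 10000
```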

The State of AI Report 2020, which got 4.5 of its 6 predictions from last year right, predicts a 10-trillion-parameter model within the next 12 months.

Going up another order of magnitude in the next ~4 years after that should be pretty straightforward.
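Back-of-the-envelope scaling implied by those figures (175B parameters for GPT-3 in 2020, a predicted 10T model within 12 months, and 100T roughly 4 years after that; the per-year factors are my own arithmetic, not from the report):

```python
# Implied year-over-year growth factors in model parameter counts.
gpt3_params = 175e9        # GPT-3, 2020
predicted_next = 10e12     # predicted within 12 months
next_magnitude = 100e12    # one more order of magnitude, ~4 years later

one_year_factor = predicted_next / gpt3_params            # ~57x in a single year
yearly_after = (next_magnitude / predicted_next) ** 0.25  # ~1.8x/yr over 4 years
print(f"{one_year_factor:.0f}x, then ~{yearly_after:.1f}x/yr")
```

If the first jump happens, the subsequent ~1.8x/yr pace needed for the next order of magnitude is indeed comparatively modest.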

OpenAI just released a paper announcing GPT-3, a 175 billion parameter language model.