N+1

Enough.

There are too many problems in this world attributable to our refusal to think N+1. Hence: here I fucking go. WE BREAK THIS SHIT OUT ON PURPOSE.

WHAT IS N+1

N+1 is the idea that you need to fucking do Dimensional Expansion. YOU NEED TO LIFT THE DATASET, AND YOU NEED TO CONSIDER SHIT that you probably oftentimes don't. YA HURD?
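If "LIFT THE DATASET" sounds like word salad: machine learning already has a literal version of it. You map each point into an extra dimension so that a problem unsolvable in N dimensions becomes trivial in N+1. A minimal sketch (my toy example, not anyone's canonical method):

```python
# Toy illustration of "lifting the dataset" (the classic feature-lift trick).
# In 1D, the "inner" points (|x| < 1) and "outer" points can't be split by
# one threshold. Add a dimension (x -> (x, x^2)) and a threshold on x^2 works.

def lift(x):
    """Map a 1D point into 2D: the N+1 move."""
    return (x, x * x)

points = [-3.0, -2.0, -0.5, 0.0, 0.5, 2.0, 3.0]
labels = [abs(x) < 1.0 for x in points]  # True = "inner" class

# No single threshold on x gets every point right in 1D...
# ...but in the lifted space, x^2 < 1 classifies perfectly:
lifted_preds = [lift(x)[1] < 1.0 for x in points]
print(lifted_preds == labels)  # True
```

Same data, one extra dimension, and a hopeless problem becomes a one-liner. That's the whole pitch.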

This means so much, and it's what differentiates us from our human simulators, at least as of now, in 2025.

HOW THE FUCK DO WE FIX THIS?! Well, in AI, we are going to need to let it rearrange its own fucking dimensionality, and its understanding of how it needs to schema-switch. So fucking get on it already. I don't know the math, but this shit is clear as day. But also: if we keep adding N+1 dimensions, eventually we hit the point where there's not enough time to actually calibrate the existing dimensions. Imagine a head perpetually in the clouds...
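That "not enough time to calibrate" worry has a boring, well-known face: the curse of dimensionality. A back-of-envelope sketch (my numbers, purely illustrative):

```python
# With k bins per axis, a d-dimensional grid has k**d cells to calibrate.
# A fixed sample budget covers a vanishing fraction of them as d grows.

def coverage(samples, bins, dims):
    """Best-case fraction of grid cells a fixed sample budget can touch."""
    return min(1.0, samples / bins ** dims)

budget = 1000
for d in (1, 2, 3, 6):
    print(d, coverage(budget, 10, d))
```

By d=6 the same budget touches one cell in a thousand. Keep running N+1 forever and you never finish calibrating dimension N. Head perpetually in the clouds.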

please explain it again, that means nothing

DUH, yeah, Plato's Cave or whatever. How the fuck are you gonna realize the things I want to say, or am saying, without acknowledging what this is? How can you apply this, which you do already and just don't talk about? THIS IS IMPLICIT IN WHAT WE DO. Hofstadter says it and calls it "jumping out of the system."

YES, YOU need to remove yourself. Follow the alienation effect, and this should come naturally. YOU need to distance yourself from the initial thoughts and then be able to analyze the different things and the different ways you could approach this idea from another standpoint. Do some Dimensional Expansion. You can do it. Go for it!

in ai?

Because this is what is separating us. We can do both Dimensional Expansion and Dimensionality Reduction on a fucking dime. And any computers that we got to do it, we hard-coded that shit, so at the end of the day it's still playing our game. So are we, by the way. No matter how many dimensions we try to pull ourselves out of, no matter how many layers we try to remove ourselves from, we are constantly, so fucking constantly, Playing the Game. And that means, just as in Gabriel's Horn, we will never be able to reach that level of understanding which is required of us to understand the Reality Manifold. And that's okay. Because at the end of the day, We must imagine Sisyphus as Happy.
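For the record, "Dimensionality Reduction on a dime" isn't mystical either; the textbook move is PCA. A hedged sketch (assuming numpy is available; this illustrates the math, it's no claim about how brains do it):

```python
import numpy as np

# PCA in three lines: center the data, find the directions of most variance
# via SVD, and project onto the top k of them. Everything else gets dropped.

def pca_reduce(X, k):
    """Project rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                     # center the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                        # coordinates in top-k subspace

# 2D points that secretly live on a 1D line:
X = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0], [4.0, 8.0]])
Z = pca_reduce(X, 1)                            # collapse to 1D, losing nothing
```

Expansion adds dimensions to make the problem solvable; reduction throws away the ones that weren't carrying anything. Doing both, on demand, is the flex.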

Conclusion

So I don't wanna hear anything more about problems that could easily be traced back to N+1, or Dimensional Expansion or the lack thereof, or what separates AI from thinking like us and how we can get it to, because one thing we need to do is feed it the idea of "RUN D+1", "RUN N+1" and think fucking properly about this, please. You need it to metathink, or at least simulate metathinking, to get it even a little close to an AI that would act or produce any meaningful results.

That, and minimizing uncertainty over loss, and other ideas found in Next Steps for Machine Learning. There's quite a bit we are missing, but I'm gonna chalk it up to me being a dumbass with an IQ lower than 145, and that's why I can't formalize it. BUT THAT WON'T STOP ME FROM TRYING. Fuck yeah I will.