Urge to code, rising... (A thread about coding)

Off topic discussion. Talk about gaming and life in general. Be awesome to each other.
Narf the Mouse
Posts: 833
Joined: Mon Nov 30, 2015 6:32 pm

Urge to code, rising... (A thread about coding)

Post by Narf the Mouse »

So we don't have a thread about coding. But we talk about coding. So here's a thread about coding. :)

So I'm going to expound on some programming, specifically coordinate-system handling. tl;dr - You probably don't want to do this. It's overkill with a nuclear-powered sledgehammer. Also, Unity is inconsistent about whether or not it draws what it considers "unreasonably large or distant" objects. But it's been a long time since I wrote a long post on something programming, and this is useful if you ever want to, say, write a game called, say, "Star Dangerous" or something. Or the next SurvivalCraft. Or are writing your own game engine. *cough*

Upside: Using just integers, it allows a map size of 119 light-minutes. Using long integers, that easily expands to 974,904 light-years, all while still using Unity's float32 vectors. Downside: You need to write your own coordinate system, with Unity's as a mere back-end, and you also need to write your own physics-engine hooks.
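As a quick sanity check on those numbers (a standalone sketch, not part of the game code; the light-minute and light-year constants are my own approximations, and I'm reading "map size" as the distance from the origin to the farthest cell):

```csharp
using System;

class MapSizeCheck
{
	static void Main()
	{
		const double metersPerCell = 1000.0;            // one "global" cell
		const double lightMinute = 299792458.0 * 60.0;  // ~1.8e10 m
		const double lightYear = 9.4607e15;             // ~9.46e15 m

		// Farthest cell reachable with a 32-bit int per axis.
		double intRadius = (double)int.MaxValue * metersPerCell;
		Console.WriteLine(intRadius / lightMinute);     // ~119 light-minutes

		// Farthest cell reachable with a 64-bit long per axis.
		double longRadius = (double)long.MaxValue * metersPerCell;
		Console.WriteLine(longRadius / lightYear);      // ~975,000 light-years
	}
}
```

The exact light-year figure depends on which value you take for the light-year, but it lands within a few hundred of the 974,904 above.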

The system: It looks something like floating-origin. Every time any discrete, top-level entity (such as the spaceship, but not the guns on the spaceship) moves more than 1,000m away from the origin, it's re-centered. Except rather than being re-centered with respect to some other top-level entity, the re-centering is entirely internal.

Inside that script, you maintain three integers or long integers named something like GlobalX, GlobalY, GlobalZ, and a Vector3 named something like Local. Then, to get the position of anything, you subtract the center top-level entity's GlobalXYZ from your own GlobalXYZ, and likewise for the Locals. Then you add the resulting GlobalXYZ * 1000.0F to the resulting Vector3. Then you rotate by the inverse of the center-point's rotation.

Simple. Everything's in the exact right position, and everything displays correctly. Right, my job's done here.

...Orrr maybe I should demystify that with some code. :lol: The code is in old-style Unity C#, since last I heard, .NET Standard 2.0 support was still experimental, and not everyone can update.

Code: Select all

using System;
using UnityEngine;

public class UniverseCoordinates : MonoBehaviour
{
	[SerializeField]
	private int globalX, globalY, globalZ;
	[SerializeField]
	private Vector3 local;
	[SerializeField]
	private Quaternion orientation;

	private void Update()
	{
		// Do stuff that changes local.

		// Maybe use an accumulative adder, so that small values are kept.
		// Maybe keep track of any precision not added.
		
		int globalMoveX = (int)local.x / 1000;
		int globalMoveY = (int)local.y / 1000;
		int globalMoveZ = (int)local.z / 1000;
		
		local.x -= globalMoveX * 1000.0F;
		local.y -= globalMoveY * 1000.0F;
		local.z -= globalMoveZ * 1000.0F;
		
		globalX += globalMoveX;
		globalY += globalMoveY;
		globalZ += globalMoveZ;
	}
	
	public static Coord operator-(UniverseCoordinates lhs, UniverseCoordinates rhs)
	{
		int globalDiffX = lhs.globalX - rhs.globalX;
		int globalDiffY = lhs.globalY - rhs.globalY;
		int globalDiffZ = lhs.globalZ - rhs.globalZ;
		
		Vector3 localDiff = lhs.local - rhs.local;
		Vector3 displacement =
			localDiff +
			new Vector3(
				globalDiffX * 1000.0F,
				globalDiffY * 1000.0F,
				globalDiffZ * 1000.0F
				);
		
		// With this, we can act like our center is pointing straight forward.
		// This also means our view matrix doesn't need to rotate.
		return
			new Coord(
				Quaternion.Inverse(rhs.orientation) * displacement,
				lhs.orientation
			);
	}
}

[RequireComponent(typeof(UniverseCoordinates))]
public class CenterCoordinates : MonoBehaviour
{
	private static readonly object _LOCK = new object();
	private static UniverseCoordinates _CENTER;

	public static UniverseCoordinates CENTER
	{
		get { return _CENTER; }
		set
		{
			// Double-checked lock: guards against _CENTER being set by
			// another thread between the first check and taking the lock.
			if (_CENTER == null)
			{
				lock (_LOCK)
				{
					if (_CENTER == null)
					{
						_CENTER = value;
						return;
					}
				}
			}

			throw new InvalidOperationException("Too many centers.");
		}
	}

	private void Start()
	{
		CENTER = GetComponent<UniverseCoordinates>();
	}
}

public struct Coord
{
	// You shouldn't need to change this.
	public readonly Vector3 Displacement;
	// You shouldn't need to change this.
	public readonly Quaternion Orientation;

	public Coord(Vector3 displacement, Quaternion orientation)
	{
		Displacement = displacement;
		Orientation = orientation;
	}
}
Previous experience tells me it's very easy to misunderstand the tone, intent, or meaning of what I've posted. If you have questions, ask.

Interkarma
Posts: 7236
Joined: Sun Mar 22, 2015 1:51 am

Re: Urge to code, rising... (A thread about coding)

Post by Interkarma »

Hey Narf welcome back! :)

This is very close to how things work in Daggerfall Unity. I use a dual coordinate system with integers for the larger world and floats for local space. There's also a displacement system.

Something I had to work around in Daggerfall Unity was not being able to use Y (up) reliably in the integer coordinate system. Verticality is very flexible in DFU, as mods can change terrain-sampling behaviour to turn small hills into mountains. When the terrain sampler changes, I have to place the player back on the ground.

This flexibility gave me some trouble when I finally implemented floating Y in the game. I had to account not only for height compensation, but also for differences between save versions of the floating-origin system and between terrain samplers. In the end I was able to make everything pretty much "just work", with a couple of edge cases that I'm not ready to tackle and that have no meaningful impact on the game.

Narf the Mouse
Posts: 833
Joined: Mon Nov 30, 2015 6:32 pm

Re: Urge to code, rising... (A thread about coding)

Post by Narf the Mouse »

Thanks!

Cool. I haven't actually looked into how DFU handles coordinates, but from what you've said in the past, I gathered it was fairly complex. :)

Floating origin without integer coordinates would handle the later games in the series just fine*, but it makes sense that a game with a landmass [pedantically correct] almost the size of Great Britain [/pedantically correct] would require more finesse in the area of coordinates.

And kudos for implementing it. :)

* Unless the NPCs are more dynamic than I think they are - I am fairly sure that NPCs out of the game's "load range" are just approximated, or basically "not there" in Morrowind.
Previous experience tells me it's very easy to misunderstand the tone, intent, or meaning of what I've posted. If you have questions, ask.

Narf the Mouse
Posts: 833
Joined: Mon Nov 30, 2015 6:32 pm

Re: Urge to code, rising... (A thread about coding)

Post by Narf the Mouse »

The key difference between this and floating origin, which I might not have communicated well (so I will perhaps needlessly try to clarify), is that it does away with any need for a floating origin. Although the player's camera is still the center-point, each object has its own position, entirely distinct from and unrelated to that camera position. This makes it entirely viable for a random spaceship at Alpha Centauri to be tracked with the exact same precision as another spaceship in orbit around Jupiter, while the player is at Tau Ceti. Edit: And, I should note, all while the (0, 0, 0) point stays the galactic center.
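To make that concrete, here's a minimal one-axis sketch (WorldPos and its members are my own names, not from the code above) showing why precision doesn't depend on where the player or the origin is: the large global parts cancel as exact integers before any float arithmetic happens.

```csharp
using System;

// Hypothetical sketch: a position is (long cells) + (float metres inside the cell).
struct WorldPos
{
	public long GlobalX;   // 1,000 m cells from the galactic centre
	public float LocalX;   // metres within the current cell

	public WorldPos(long g, float l) { GlobalX = g; LocalX = l; }

	// Displacement in metres, computed global-first so the floats stay small.
	public static double Metres(WorldPos a, WorldPos b)
	{
		return (a.GlobalX - b.GlobalX) * 1000.0 + (a.LocalX - b.LocalX);
	}
}

class Demo
{
	static void Main()
	{
		// Two ships ~40 light-minutes from the origin (x-axis only, for brevity).
		var shipA = new WorldPos(719501899L, 123.25f);
		var shipB = new WorldPos(719501899L, 120.75f);

		// Naively adding everything into one float would swallow this 2.5 m gap;
		// subtracting the global parts first keeps it exact.
		Console.WriteLine(WorldPos.Metres(shipA, shipB)); // 2.5
	}
}
```

The same subtraction works no matter where the third party (the player) is, which is the point: nothing ever has to be re-expressed relative to a moving origin.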

Perhaps I cause confusion by calling the camera the "Center". It isn't, really, except for the purpose of drawing stuff on the screen. "Visual center", perhaps? Coding for multiple cameras would allow "multiple subjective visual centers".

If this was clear, my apologies; I wasn't certain I'd communicated it well. :)

Edit: Various edits.
Previous experience tells me it's very easy to misunderstand the tone, intent, or meaning of what I've posted. If you have questions, ask.

Narf the Mouse
Posts: 833
Joined: Mon Nov 30, 2015 6:32 pm

Re: Urge to code, rising... (A thread about coding)

Post by Narf the Mouse »

Apologies; insomnia garbled my brain-thoughts.

I was able to use a comparable setup to avoid the need for a floating origin in the tech testbed I used to develop the idea, by constructing the render matrix for each mesh from its coordinates relative to the camera. However, Unity does not have built-in support for this sort of thing (although the scriptable render pipeline might be able to do it?), so you still need some form of floating origin with Unity.

Hopefully that makes it clearer. I proved last night, once again, that there's no such thing as a quick game of Stellaris, when I wondered why the room was getting lighter and heard birdies. Fortunately, I did get some sleep after that.
Previous experience tells me it's very easy to misunderstand the tone, intent, or meaning of what I've posted. If you have questions, ask.
