Daggerfall Unity VR with the Vive Pro

Discuss coding questions, pull requests, and implementation details.
User avatar
Interkarma
Posts: 7236
Joined: Sun Mar 22, 2015 1:51 am

Re: Daggerfall Unity VR with the Vive Pro

Post by Interkarma »

The distinct UI stacks are in place now, but the rendering is still all sent back to the DaggerfallUI.RenderTarget instead of the appropriate target. I'm working on ways to resolve this now.

Once I'm done, it should be trivial to split out things like HUD components (e.g. compass and health bars). But splitting out other items from their position in the stack will be more problematic. One problem at a time.

User avatar
InconsolableCellist
Posts: 100
Joined: Mon Mar 23, 2015 6:00 am

Re: Daggerfall Unity VR with the Vive Pro

Post by InconsolableCellist »

Sounds like good progress! Currently I'm looking at how other Unity VR projects handle cursor input with the controller, and I'm trying to use one to see if that'll be useful for input management. If this works I may not even need the input offset coordinates and the other event handling...it looks like these systems are using a UI camera to cast out rays, and then they somehow integrate into the existing UI event system.

I was looking at this one, but I'm not sure if it's too tightly coupled to a Unity UI system that DFU doesn't use (or does it?): https://github.com/VREALITY/ViveUGUIModule

User avatar
Interkarma
Posts: 7236
Joined: Sun Mar 22, 2015 1:51 am

Re: Daggerfall Unity VR with the Vive Pro

Post by Interkarma »

The Daggerfall Unity UI (which I informally call DagUI) doesn't use uGUI at all. It actually started life as a UI system for my long abandoned "Ruins of Hill Deep" project on XNA. It has evolved a long way since then.

I also have a custom InputManager layer that sits on top of Unity's input system. The mouse is sampled using Input.GetAxisRaw("Mouse X") and Input.GetAxisRaw("Mouse Y"). Unless your input system can feed relative coordinates for these back into the Unity input system, you're probably better off sending custom coords into my InputManager.

This is where things get better though. You can now project the classic UI to any world object running a Canvas and RawImage output. The RawImage can accept raycast hits from your controllers. You should be able to send your ray hit position back into my UI and input system to simulate the mouse from that controller hit.
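
Something like this untested sketch should work for the hit-to-coordinates part, assuming your RawImage object carries a collider so it can receive physics raycasts. The SendCustomMousePosition hook at the end is hypothetical, since that entry point into my InputManager doesn't exist yet.

Code: Select all

using UnityEngine;

public class ControllerUIPointer : MonoBehaviour
{
    public Transform controller;     // VR controller used as the pointer origin
    public RectTransform uiSurface;  // RectTransform of the RawImage hosting the UI

    void Update()
    {
        // Cast a ray from the controller; requires a collider (e.g. BoxCollider)
        // on the RawImage object for Physics.Raycast to hit
        Ray ray = new Ray(controller.position, controller.forward);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit) && hit.transform == uiSurface.transform)
        {
            // Convert the world-space hit point to the surface's local space,
            // then normalize against the rect to get 0..1 coordinates
            Vector2 local = uiSurface.InverseTransformPoint(hit.point);
            Rect rect = uiSurface.rect;
            float u = (local.x - rect.xMin) / rect.width;
            float v = (local.y - rect.yMin) / rect.height;

            // Hypothetical hook - feeding these coords into the UI and input
            // system is the part still to come:
            //InputManager.Instance.SendCustomMousePosition(new Vector2(u, 1f - v));
        }
    }
}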

I don't have a good visual understanding of what you're planning, however. I was imagining something like the Steam UI that pops up when you hit the system menu on a controller. The game is paused and blurred, and a world object is placed in front of the player like a theatre screen. This would host the classic UI, and you could control it by pointing, scrolling, and clicking like you would with the mouse now. Closing the UI would resume the game.
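
In rough code terms, the flow I'm imagining looks something like this sketch (VRMenuHost and floatingUIPanel are illustrative names only, nothing in DFU; the blur is left out):

Code: Select all

using UnityEngine;

public class VRMenuHost : MonoBehaviour
{
    // World object hosting the classic UI, e.g. a Canvas + RawImage quad
    public GameObject floatingUIPanel;

    public void OpenMenu()
    {
        Time.timeScale = 0f;             // pause the game world
        floatingUIPanel.SetActive(true); // present the UI like a theatre screen
    }

    public void CloseMenu()
    {
        floatingUIPanel.SetActive(false);
        Time.timeScale = 1f;             // resume play
    }
}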

Other elements like the health bars and compass can be disabled on the standard HUD and rehosted on diegetic floating canvases that move along with the player.

User avatar
InconsolableCellist
Posts: 100
Joined: Mon Mar 23, 2015 6:00 am

Re: Daggerfall Unity VR with the Vive Pro

Post by InconsolableCellist »

Interkarma wrote: Mon Jun 04, 2018 3:54 am The Daggerfall Unity UI (which I informally call DagUI) doesn't use uGUI at all. It actually started life as a UI system for my long abandoned "Ruins of Hill Deep" project on XNA. It has evolved a long way since then.

I also have a custom InputManager layer that sits on top of Unity's input system. The mouse is sampled using Input.GetAxisRaw("Mouse X") and Input.GetAxisRaw("Mouse Y"). Unless your input system can feed relative coordinates for these back into the Unity input system, you're probably better off sending custom coords into my InputManager.
This is actually something of a relief, because integrating those existing systems isn't all that easy. I've seen advice to use VRTK (though I discovered it was recently discontinued), and it piles on so many layers requiring integration that I'm relieved it's not an option anymore.
Interkarma wrote: Mon Jun 04, 2018 3:54 am This is where things get better though. You can now project the classic UI to any world object running a Canvas and RawImage output. The RawImage can accept raycast hits from your controllers. You should be able to send your ray hit position back into my UI and input system to simulate the mouse from that controller hit.
This sounds great!
Interkarma wrote: Mon Jun 04, 2018 3:54 am I don't have a good visual understanding of what you're planning, however. I was imagining something like the Steam UI that pops up when you hit the system menu on a controller. The game is paused and blurred, and a world object is placed in front of the player like a theatre screen. This would host the classic UI, and you could control it by pointing, scrolling, and clicking like you would with the mouse now. Closing the UI would resume the game.
That's essentially what I'm picturing as a first pass, just to make the game playable. I figured I'd have the UI floating in the world, drawn above every other element, with the world paused. As a second pass, I'd like to make some windows not pause the game, so you could reach out and reposition them, and they'd stay anchored in that position as you move around. Think of having a little inventory window or a journal stuck somewhere convenient.

As a third pass, I want to have common UI elements integrated with the player's virtual body and controllers, so you can turn a controller upside down and see a compass floating there, or have a spellbook in a hand so you can quickly use it in combat.


User avatar
InconsolableCellist
Posts: 100
Joined: Mon Mar 23, 2015 6:00 am

Re: Daggerfall Unity VR with the Vive Pro

Post by InconsolableCellist »

Where should I look for passing in the custom coordinates? I've been reading your recent commits but they seem to be related to the stack improvements. Perhaps I missed this aspect.

User avatar
Interkarma
Posts: 7236
Joined: Sun Mar 22, 2015 1:51 am

Re: Daggerfall Unity VR with the Vive Pro

Post by Interkarma »

InconsolableCellist wrote: Mon Jun 04, 2018 4:15 am Where should I look for passing in the custom coordinates? I've been reading your recent commits but they seem to be related to the stack improvements. Perhaps I missed this aspect.
I haven't done that for you yet, I'm sorry. I needed to solve some other items first. The good news is I'm nearly there. I'm about to post a quick demo of how to add the compass to a custom floating render target instead of being drawn on the usual HUD layer.

Note that I'm not doing this in VR, but the principle is the same. I'll start work on coords after this bit.

Edit: Have found a blocking issue for my compass demo. Will work this out and update when I can.

User avatar
Nystul
Posts: 1501
Joined: Mon Mar 23, 2015 8:31 am

Re: Daggerfall Unity VR with the Vive Pro

Post by Nystul »

When designing the automap window, I thought about what would happen if you had the window open while the game ran unpaused. I decided to go for the easy way and just assume the game is always paused... We would have to rewrite some of its logic if we need this feature. Let me know what your plans are, so we might be able to work something out.

User avatar
Interkarma
Posts: 7236
Joined: Sun Mar 22, 2015 1:51 am

Re: Daggerfall Unity VR with the Vive Pro

Post by Interkarma »

Yep, and a lot of UI components (and I mean a lot) assume Screen.width and Screen.height as the output. This is fine (for now), and the way I've set up the offscreen RenderTexture is compatible with this (for now). But it's not going to last forever. Ideally the panels will be aware of their own offscreen target dimensions and we won't need to use Screen.width/height at all.
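
Purely as an illustrative sketch of where I'd like the panels to end up (not actual DFU code), resolving the output size would look something like:

Code: Select all

// Illustrative only: resolve output size from the panel's own render
// target when one is assigned, falling back to the screen as now
Rect GetOutputRect(RenderTexture offscreenTarget)
{
    if (offscreenTarget != null)
        return new Rect(0, 0, offscreenTarget.width, offscreenTarget.height);

    // Legacy behaviour: assume the whole screen
    return new Rect(0, 0, Screen.width, Screen.height);
}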

For now it's just baby steps. I'm solving problems one small piece at a time. Until then, nothing else changes in core. :)

User avatar
Interkarma
Posts: 7236
Joined: Sun Mar 22, 2015 1:51 am

Re: Daggerfall Unity VR with the Vive Pro

Post by Interkarma »

Summary

Note: Requires latest code from git if you don't have it already.

Following is an example of how to add the compass into the world as its own diegetic object rather than outputting to the non-diegetic HUD overlay. This tutorial assumes you have already disabled the RawImage component on the NonDiegeticUIOutput object so the overlay UI is no longer visible.

Just to set expectations, there are some hacky bits to how this works. When drawing with DrawTexture() to an offscreen RenderTexture, Unity likes to scale the rendering by the Screen.width and Screen.height dimensions regardless of the actual RenderTexture dimensions. Internally, I'm using the panel scaling logic to compensate. I literally have no idea how well this will work in VR, but we have to start somewhere.
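
To give a concrete idea of the compensation involved, an illustrative sketch (not the actual internal code):

Code: Select all

// DrawTexture() coordinates are scaled against the screen, so drawing
// into a RenderTexture of a different size needs a compensating
// scale factor along each axis
Vector2 GetCompensationScale(RenderTexture target)
{
    return new Vector2(
        (float)target.width / Screen.width,
        (float)target.height / Screen.height);
}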

Code

Once you've cleared the standard UI, the first step is to create a new C# script called CustomCompass.cs and copy in the following code. The basic theory is that each UserInterfaceRenderTarget works like its own independent UI stack with nested panels. You can have as many of these floating around as you like and put in whatever DagUI controls you want. Just keep in mind that most full-scale UI windows (inventory, automap, etc.) aren't designed to run while the game is executing. But the HUD components are designed this way, so they're perfect to test with.

Code: Select all

using UnityEngine;
using DaggerfallWorkshop.Game.UserInterface;

[RequireComponent(typeof(UserInterfaceRenderTarget))]
public class CustomCompass : MonoBehaviour
{
    UserInterfaceRenderTarget ui;
    HUDCompass compass;

    private void Start()
    {
        // Setup offscreen UI - compass frame is 69x17 pixels
        ui = GetComponent<UserInterfaceRenderTarget>();
        ui.CustomWidth = 69;
        ui.CustomHeight = 17;

        // Create HUD compass and add to offscreen UI parent panel
        compass = new HUDCompass();
        ui.ParentPanel.Components.Add(compass);
    }

    private void Update()
    {
        // Scale compass to parent
        compass.Scale = ui.ParentPanel.LocalScale;
    }
}
World Object

Next step is to create an empty GameObject. I positioned this one underneath SmoothFollower, but you can place it however you like in the world. Call this object CustomCompass as well and add the following components.
  • Canvas
  • RawImage
  • UserInterfaceRenderTarget
  • CustomCompass
Configure it in the Inspector, copying all settings so the object properties look like the screenshot below. This assumes CustomCompass is parented to the player in some way.
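
For reference, the same object can be assembled in code; a sketch, assuming a world-space canvas and some player transform to parent under:

Code: Select all

using UnityEngine;
using UnityEngine.UI;
using DaggerfallWorkshop.Game.UserInterface;

public static class CompassBuilder
{
    // Sketch only: assembles the CustomCompass object in code rather than
    // the Inspector; render mode and parenting are assumptions
    public static GameObject Build(Transform player)
    {
        GameObject go = new GameObject("CustomCompass");
        go.transform.SetParent(player, false); // e.g. under SmoothFollower

        Canvas canvas = go.AddComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        go.AddComponent<RawImage>();                  // surface displaying the UI
        go.AddComponent<UserInterfaceRenderTarget>(); // offscreen DagUI host
        go.AddComponent<CustomCompass>();             // the script from above
        return go;
    }
}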

[Attachment: custom-compass-inspector.JPG - Inspector settings for the CustomCompass object]

When you run the game, the floating compass should be visible around waist level, as shown below. The compass object is now inside the world with the player rather than in the HUD overlay.

[Screenshot: the floating compass rendered in the game world]


Improvements

This simple setup isn't without some limitations. The object is subject to lighting conditions and will clip through other geometry. A good improvement would probably be to set up a camera stack and render these components from their own camera, so you can control presentation under all conditions.
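
An untested sketch of that camera-stack idea, where "FloatingUI" is an assumed layer you'd add to the project yourself (you'd also want to parent the camera to the player rig so it tracks the view):

Code: Select all

using UnityEngine;

public class FloatingUICamera : MonoBehaviour
{
    public GameObject customCompassObject; // the world object created above

    void Start()
    {
        int uiMask = LayerMask.GetMask("FloatingUI");

        // Dedicated camera that only sees the floating UI layer and draws
        // over the main view, free of scene lighting and geometry
        Camera uiCamera = new GameObject("FloatingUICamera").AddComponent<Camera>();
        uiCamera.clearFlags = CameraClearFlags.Depth;
        uiCamera.cullingMask = uiMask;
        uiCamera.depth = Camera.main.depth + 1; // render after the main camera

        // Hide the layer from the main camera and move the compass onto it
        Camera.main.cullingMask &= ~uiMask;
        customCompassObject.layer = LayerMask.NameToLayer("FloatingUI");
    }
}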

This is really just a proof of concept, but it represents a ton of work in the UI back-end to make this possible. Hopefully it works just as easily for you in the VR environment.

Once we know this works, then I'll get cracking on pointer input support for you.

User avatar
InconsolableCellist
Posts: 100
Joined: Mon Mar 23, 2015 6:00 am

Re: Daggerfall Unity VR with the Vive Pro

Post by InconsolableCellist »

Excellent! Thanks for the detailed post. I'll be able to get to this later, and I'll do a little demo.
