Kean Walmsley



May 25, 2012

Creating a 3D viewer for our Apollonian service using WinRT – Part 1

After tackling the implementation of a basic 3D viewer for our Apollonian web-service using a variety of technology stacks – AutoCAD, Unity3D, Android, iOS & HTML5/WebGL – I felt as though I really needed to give it a try with WinRT, the new runtime powering Windows 8.

All of the previous stacks had some “object” layer I could use above the base graphics engine – Rajawali provided it for Android/OpenGL ES, iSGL3D for iOS and Three.js for HTML5/WebGL – but for WinRT all bets were off. The general guidance for developing Metro-style 3D applications (typically that means games) is to use DirectX 11 (i.e. Direct3D) from C++.

I found two managed wrappers that allow DirectX to be used from C#: SlimDX is the established player, and SharpDX the new kid on the block (although as it's apparently auto-generated from the DirectX headers, it already has broad coverage). I went with SharpDX, as – at the time of writing – it was the only one that worked with WinRT to allow creation of Metro-style apps. Both SlimDX and SharpDX have pretty sparse documentation and samples, unfortunately, but you take what you can get. And neither is intended to provide the kind of object framework I'd been spoiled with on other platforms: perhaps such frameworks are out there or in development (there seems to be some excitement around ANX.Framework, apparently an open source replacement for Microsoft's popular XNA, but it seems a ways off being usable).

So I had to roll up my sleeves and wade into low-level graphics implementation – something I’d explicitly avoided when working with OpenGL ES.

The first thing to take care of was building the polygons representing a sphere: something I managed to find in this C++ DirectX sample. I combined that with much of the code in the MiniCube and MiniCubeXaml SharpDX samples to create a Metro-style 3D viewer for our Apollonian models.
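The tessellation in that sample builds a standard UV sphere: slices of latitude, segments of longitude, with each unit-length vertex position doubling as the surface normal. A minimal Python sketch of the same vertex generation (the function name and defaults are mine, not from the sample):

```python
import math

def sphere_vertices(num_segs=32):
    """Generate UV-sphere vertices; each position is also its normal."""
    num_slices = num_segs // 2
    verts = []
    for slice_ in range(num_slices + 1):
        inclination = (slice_ / num_slices) * math.pi
        y = math.cos(inclination)
        r = math.sin(inclination)
        for segment in range(num_segs + 1):
            azimuth = (segment / num_segs) * 2.0 * math.pi
            # Unit-radius position; the normal is the same vector
            verts.append((r * math.sin(azimuth), y, r * math.cos(azimuth)))
    return verts

verts = sphere_vertices(32)
print(len(verts))  # (num_slices + 1) * (num_segs + 1) = 17 * 33 = 561
```

Note the duplicated seam column (numSegs + 1 vertices per slice): it keeps the index arithmetic simple at the cost of a few extra vertices.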

That’s the quick version of the story: the longer version is a bit more detailed, as I had to learn all about buffers and shaders. In the Direct3D pipeline you create a vertex shader, which takes its data from buffers loaded onto the GPU and outputs values that get passed on to the pixel shader, which then calculates the colour of each pixel based on the interpolated data it receives, lighting, etc. Nasty, low-level stuff (for me, at least). I had to spend some time understanding the basics of HLSL, the C-like language you use to write these shaders before they get compiled into code the GPU can execute.

And it took me some time to discover that the Direct3D feature level supported by my graphics card – probably limited by the Parallels virtualization layer between Windows 8 and my Mac hardware – influenced the way I had to compile the shaders: the app now targets a minimum of feature level 9.3, which presumably means that compiling for a higher level would give better performance on more capable hardware.

Anyway, by this point I had a simple, shaded, rotating sphere. Then I had to learn about – and implement – instancing, as one sphere wasn’t enough and we clearly didn’t want to have to generate a mesh for each of our spheres. This took me some time to get right – my app would work well until I tried to access some of the instance-specific data in my vertex shader, at which point I’d just get no graphics at all. Eventually I understood that there was a different version of the InputElement() constructor that took arguments allowing definition of instance-specific data – we needed this to define the position, radius and colour of each sphere.
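Conceptually, the second (per-instance) vertex stream is just a tightly packed array of {position, colour, radius} records: 3 + 4 + 1 floats, a 32-byte stride that the per-instance InputElement byte offsets (0, 12, 28) index into. A small Python sketch of that packing (the layout matches the SphereDefinition struct shown later; the helper name is mine):

```python
import struct

def pack_instance(pos, col, rad):
    """Pack one sphere instance: float3 position, float4 colour, float radius."""
    # '<8f' = 8 little-endian 32-bit floats = 32 bytes per instance,
    # matching byte offsets 0 (position), 12 (colour) and 28 (radius)
    return struct.pack('<8f', *pos, *col, rad)

record = pack_instance((0.5, 0.0, -0.25), (1.0, 0.0, 0.0, 1.0), 0.1)
print(len(record))  # 32 bytes: the stride of the instance buffer
```

On the C# side, SharpDX's Utilities.SizeOf&lt;SphereDefinition&gt;() computes this same stride from the struct layout.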

While I was hitting this particular wall, I was excited to find out about the new Graphics Debugger in Visual Studio 11. This is enabled automatically for C++ projects, but when working with managed code – via SlimDX or SharpDX – it takes more effort to get working. I found a few forum posts explaining an approach involving creating a separate EXE and then a C++ project that launches it (rather than debugging the managed project directly). Neither of which helped me diagnose the issue – or even led to the Graphics Debugger displaying any useful data, as it didn’t let me capture a frame, for some reason – but it was an interesting process.

Once I finally had instancing working, it was then a simple matter of plugging together some code to access our web-service and extract the sphere information from our JSON results.
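The filtering step in that extraction keeps only the spheres touching the boundary of the (unit-radius) outer sphere – those whose centre distance plus radius exceeds 0.99. A Python sketch of the same extraction, using an illustrative two-sphere payload (the field names X, Y, Z, R and L match the service's results; the sample values are made up):

```python
import json
import math

def spheres_from_json(text, threshold=0.99):
    """Keep only spheres near the edge of the unit outer sphere."""
    spheres = []
    for obj in json.loads(text):
        pos = (obj["X"], obj["Y"], obj["Z"])
        rad = obj["R"]
        # Distance of centre from origin, plus radius, against the cutoff
        if math.sqrt(sum(c * c for c in pos)) + rad > threshold:
            spheres.append((pos, rad, obj["L"]))
    return spheres

sample = ('[{"X":0.9,"Y":0,"Z":0,"R":0.1,"L":1},'
          '{"X":0.1,"Y":0.1,"Z":0,"R":0.05,"L":2}]')
print(len(spheres_from_json(sample)))  # 1: only the first sphere touches the edge
```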

Here’s the resulting application in action:

[Embedded video: the Apollonian viewer running as a Metro-style app]

Here’s the VS11 project. The two main source files are SimpleSphere.fx – the HLSL source for the vertex and pixel shaders…

float4x4 worldViewProj;

struct VS_IN
{
  float4 pos : POSITION;
  float3 norm : NORMAL;
  float3 instancePos : TEXCOORD0;
  float4 col : TEXCOORD1;
  float rad : TEXCOORD2;
};

struct PS_IN
{
  float4 pos : SV_POSITION;
  float3 norm : NORMAL;
  float3 worldPos : POSITION0;
  float4 col : COLOR0;
  float rad : TEXCOORD0;
};

PS_IN VS( VS_IN input )
{
  PS_IN output = (PS_IN)0;

  float4 temp = input.pos;
  temp.w = 1.0f;

  // Update the position of the vertices based on the
  // data for this particular instance

  temp.xyz *= input.rad;

  temp.x += input.instancePos.x;
  temp.y += input.instancePos.y;
  temp.z += input.instancePos.z;

  output.worldPos = temp.xyz / temp.w;
  temp = mul(temp, worldViewProj);
  output.pos = temp;
  output.norm = input.norm;
  output.col = input.col;
  output.rad = input.rad;
  return output;
}

float4 PS( PS_IN input ) : SV_Target
{
  float3 LightDirection = float3(0, 0, 1);
  float3 AmbientColor = float3(0.43, 0.31, 0.24);
  float3 LightColor = 1 - AmbientColor;
  float SpotRadius = 50;

  // Basic ambient (Ka) and diffuse (Kd) lighting from above

  float3 N = normalize(input.norm);
  float NdotL = dot(N, LightDirection);
  float Ka = saturate(NdotL + 1);
  float Kd = saturate(NdotL);

  // Spotlight

  float3 Vec = input.worldPos;
  float Dist2D = sqrt(dot(Vec.xy, Vec.xy));
  Kd = Kd * saturate(SpotRadius / Dist2D);

  // Diffuse reflection of light off ball

  float Dist3D = sqrt(dot(Vec, Vec));
  float3 V = normalize(Vec);
  Kd +=
    saturate(dot(-V, N)) *
    saturate(dot(V, LightDirection)) *
      saturate(input.rad / Dist3D);

  // Final composite

  float3 Color =
    input.col.xyz * ((AmbientColor * Ka) + (LightColor * Kd));
  return float4( Color * 0.7f, 1 );
}

… and ApollonianRenderer.cs, the C# source for our main graphics implementation:

using System;
using System.Diagnostics;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Windows.Data.Json;
using CommonDX;
using SharpDX;
using SharpDX.Direct3D;
using SharpDX.Direct3D11;
using SharpDX.DXGI;
using SharpDX.IO;
using Buffer = SharpDX.Direct3D11.Buffer;
using Device = SharpDX.Direct3D11.Device;

namespace ApollonianViewer
{
  // We allow the UI to register an event to be called
  // once a level has been loaded completely

  public delegate void RenderInitCompletedEventHandler(
    object sender, EventArgs e
  );

  public class ApollonianRenderer : Component
  {
    Device _device;

    // Our core geometry data

    BasicVertex[] _vertices;
    short[] _indices;
    private int _vertCount;
    private int _idxCount;

    // Our members to map this data for the GPU

    private InputLayout _layout;
    private Buffer[] _vertBufs;
    private int[] _vertStrides;
    private int[] _vertOffsets;
    private Buffer _idxBuf;
    private Buffer _constBuf;
    private int _instCount;
    private Stopwatch _clock;
    private VertexShader _vertShader;
    private PixelShader _pxlShader;

    // The current level we're rendering

    private int _level;

    // Some structs for our vertex and sphere information

    struct BasicVertex
    {
      public Vector3 pos;  // position
      public Vector3 norm; // surface normal vector
    };

    struct SphereDefinition
    {
      public Vector3 instancePos;
      public Vector4 col;
      public float rad;
    }

 

    // Initializes a new instance of ApollonianRenderer

    public ApollonianRenderer(DeviceManager devices, int level)
    {
      Scale = 1.0f;
      _level = level;
      _instCount = 0;

      // We need only create the geometry for a sphere once

      CreateSphereGeometry(out _vertices, out _indices);
      _vertCount = _vertices.Length;
      _idxCount = _indices.Length;
    }

    public float Scale { get; set; }

    // When the level is changed, we actually need to query
    // the web-service and process the returned data

    public int Level
    {
      get
      {
        return _level;
      }
      set
      {
        if (_level != value)
        {
          _level = value;
          CreateInstancesForLevel(_device, _level);
        }
      }
    }

    // Our event for the UI to be told when we're done loading

    public event RenderInitCompletedEventHandler OnInitCompleted;

    // Create the geometry representing a sphere

    static void CreateSphereGeometry(
      out BasicVertex[] vertices,
      out short[] indices
    )
    {
      // Determine the granularity of our polygons

      const int numSegs = 32;
      const int numSlices = numSegs / 2;

      // Collect the vertices for our triangles

      int numVerts = (numSlices + 1) * (numSegs + 1);
      vertices = new BasicVertex[numVerts];

      for (int slice = 0; slice <= numSlices; slice++)
      {
        float v = (float)slice / (float)numSlices;
        float inclination = v * (float)Math.PI;
        float y = (float)Math.Cos(inclination);
        float r = (float)Math.Sin(inclination);
        for (int segment = 0; segment <= numSegs; segment++)
        {
          float u = (float)segment / (float)numSegs;
          float azimuth = u * (float)Math.PI * 2.0f;
          int index = slice * (numSegs + 1) + segment;
          vertices[index].pos =
            new Vector3(
              r * (float)Math.Sin(azimuth),
              y,
              r * (float)Math.Cos(azimuth)
            );
          vertices[index].norm = vertices[index].pos;
        }
      }

      // Create the indices linking these vertices

      int numIndices = numSlices * (numSegs - 2) * 6;
      indices = new short[numIndices];

      uint idx = 0;
      for (int slice = 0; slice < numSlices; slice++)
      {
        ushort sliceBase0 = (ushort)((slice    ) * (numSegs + 1));
        ushort sliceBase1 = (ushort)((slice + 1) * (numSegs + 1));
        for (short segment = 0; segment < numSegs; segment++)
        {
          if (slice > 0)
          {
            indices[idx++] = (short)(sliceBase0 + segment);
            indices[idx++] = (short)(sliceBase0 + segment + 1);
            indices[idx++] = (short)(sliceBase1 + segment + 1);
          }
          if (slice < numSlices - 1)
          {
            indices[idx++] = (short)(sliceBase0 + segment);
            indices[idx++] = (short)(sliceBase1 + segment + 1);
            indices[idx++] = (short)(sliceBase1 + segment);
          }
        }
      }
    }

 

    public virtual void Initialize(DeviceManager devices)
    {
      // Remove previous buffer

      SafeDispose(ref _constBuf);

      CreatePipeline(devices);

      CreateInstancesForLevel(
        devices.DeviceDirect3D, _level
      );

      _clock = new Stopwatch();
      _clock.Start();
    }

    // Create the Direct3D pipeline for our rendering

    private void CreatePipeline(DeviceManager devices)
    {
      // Setup local variables

      _device = devices.DeviceDirect3D;

      var path =
        Windows.ApplicationModel.Package.Current.
          InstalledLocation.Path;

      // Loads vertex shader bytecode

      var vertexShaderByteCode =
        NativeFile.ReadAllBytes(path + "\\SimpleSphere_VS.fxo");
      _vertShader =
        new VertexShader(_device, vertexShaderByteCode);

      // Loads pixel shader bytecode

      _pxlShader =
        new PixelShader(
          _device,
          NativeFile.ReadAllBytes(path + "\\SimpleSphere_PS.fxo")
        );

      // Layout from VertexShader input signature

      _layout =
        new InputLayout(
          _device,
          vertexShaderByteCode,
          new[]
          {
            // Per-vertex data

            new InputElement(
              "POSITION", 0, Format.R32G32B32_Float, 0, 0
            ),
            new InputElement(
              "NORMAL", 0, Format.R32G32B32_Float, 12, 0
            ),

            // Per-instance data

            // Instance position

            new InputElement(
              "TEXCOORD", 0, Format.R32G32B32_Float, 0, 1,
              InputClassification.PerInstanceData, 1
            ),

            // Instance colour

            new InputElement(
              "TEXCOORD", 1, Format.R32G32B32A32_Float, 12, 1,
              InputClassification.PerInstanceData, 1
            ),

            // Instance radius

            new InputElement(
              "TEXCOORD", 2, Format.R32_Float, 28, 1,
              InputClassification.PerInstanceData, 1
            )
          }
        );
    }

 

    // Access the Apollonian web-service and create the instance
    // information from the results

    private async void CreateInstancesForLevel(Device dev, int lev)
    {
      // Set up our various arrays and populate them

      SphereDefinition[] instances = null;

      try
      {
        instances = await SpheresForLevel(lev);
        _instCount = instances.Length;

        // Create our buffers

        _idxBuf =
          ToDispose(
            Buffer.Create(dev, BindFlags.IndexBuffer, _indices)
          );

        _vertBufs =
          new Buffer[]
          {
            ToDispose(
              Buffer.Create(dev, BindFlags.VertexBuffer, _vertices)
            ),
            ToDispose(
              Buffer.Create(dev, BindFlags.VertexBuffer, instances)
            )
          };

        _vertStrides =
          new int[]
          {
            Utilities.SizeOf<BasicVertex>(),
            Utilities.SizeOf<SphereDefinition>()
          };

        _vertOffsets = new int[] { 0, 0 };

        // Create Constant Buffer

        _constBuf =
          ToDispose(
            new Buffer(
              dev,
              Utilities.SizeOf<Matrix>(),
              ResourceUsage.Default,
              BindFlags.ConstantBuffer,
              CpuAccessFlags.None,
              ResourceOptionFlags.None,
              0
            )
          );
      }
      catch { }

      if (OnInitCompleted != null)
      {
        OnInitCompleted(this, new EventArgs());
      }
    }

 

    // Generate the sphere instance information for a
    // particular level

    private async Task<SphereDefinition[]> SpheresForLevel(
      int level
    )
    {
      string responseText = await GetJsonStream(level);
      return SpheresFromJson(responseText);
    }

    // Access our web-service asynchronously and return the
    // results

    private async Task<string> GetJsonStream(int level)
    {
      HttpClient client = new HttpClient();
      string url =
        "http://apollonian.cloudapp.net/api/spheres/1/" +
        level.ToString();
      client.MaxResponseContentBufferSize = 1500000;
      HttpResponseMessage response = await client.GetAsync(url);
      return await response.Content.ReadAsStringAsync();
    }

    // Extract the sphere definitions from the JSON data

    private static SphereDefinition[] SpheresFromJson(
      string responseText
    )
    {
      // Create our list to return and the list of colors

      var spheres = new List<SphereDefinition>();
      var colors =
        new Vector4[]
        {
          Colors.Black.ToVector4(),
          Colors.Red.ToVector4(),
          Colors.Yellow.ToVector4(),
          Colors.Green.ToVector4(),
          Colors.Cyan.ToVector4(),
          Colors.Blue.ToVector4(),
          Colors.Magenta.ToVector4(),
          Colors.DarkGray.ToVector4(),
          Colors.Gray.ToVector4(),
          Colors.LightGray.ToVector4(),
          Colors.White.ToVector4()
        };

      // Our data contains an array at its root

      JsonArray root = JsonArray.Parse(responseText);
      foreach (JsonValue val in root)
      {
        // Each value in the array is actually an object

        JsonObject obj = val.GetObject();

        // Extract the properties we need from each object

        SphereDefinition def;
        def.instancePos.X = (float)obj.GetNamedNumber("X");
        def.instancePos.Y = (float)obj.GetNamedNumber("Y");
        def.instancePos.Z = (float)obj.GetNamedNumber("Z");
        def.rad = (float)obj.GetNamedNumber("R");
        var level = (int)obj.GetNamedNumber("L");
        def.col = colors[level <= 10 ? level : 10];

        // Only add spheres near the edge of the outer one

        if (def.instancePos.Length() + def.rad > 0.99)
          spheres.Add(def);
      }
      return spheres.ToArray();
    }

 

    // This is called in a loop

    public virtual void Render(TargetBase render)
    {
      if (_clock == null) return;

      var ctxt = render.DeviceManager.ContextDirect3D;
      var dev = render.DeviceManager.DeviceDirect3D;

      float width = (float)render.RenderTargetSize.Width;
      float height = (float)render.RenderTargetSize.Height;

      // Prepare matrices

      var view =
        Matrix.LookAtLH(
          new Vector3(0, 0, -5),
          new Vector3(0, 0, 0),
          Vector3.UnitY
        );

      var proj =
        Matrix.PerspectiveFovLH(
          (float)Math.PI / 4.0f,
          width / (float)height,
          0.1f,
          100.0f
        );

      var viewProj = Matrix.Multiply(view, proj);

      var time =
        (float)(_clock.ElapsedMilliseconds / 1000.0);

      // Set targets (this is mandatory in the loop)

      ctxt.OutputMerger.SetTargets(
        render.DepthStencilView,
        render.RenderTargetView
      );

      // Clear the views

      ctxt.ClearDepthStencilView(
        render.DepthStencilView,
        DepthStencilClearFlags.Depth,
        1.0f,
        0
      );

      ctxt.ClearRenderTargetView(
        render.RenderTargetView,
        Colors.Black
      );

      // If we have instances, let's display them

      if (_instCount > 0)
      {
        // Setup the pipeline

        ctxt.InputAssembler.InputLayout = _layout;

        ctxt.InputAssembler.PrimitiveTopology =
          PrimitiveTopology.TriangleList;

        ctxt.InputAssembler.SetVertexBuffers(
          0,
          _vertBufs,
          _vertStrides,
          _vertOffsets
        );

        ctxt.InputAssembler.SetIndexBuffer(
          _idxBuf,
          Format.R16_UInt,
          0
        );

        ctxt.VertexShader.SetConstantBuffer(
          0,
          _constBuf
        );

        ctxt.VertexShader.Set(_vertShader);
        ctxt.PixelShader.Set(_pxlShader);

        // Calculate WorldViewProj

        var worldViewProj =
          Matrix.Scaling(Scale) *
          Matrix.RotationY(-time) *
          viewProj;
        worldViewProj.Transpose();

        // Update constant buffer

        ctxt.UpdateSubresource(ref worldViewProj, _constBuf);

        // Draw the spheres

        ctxt.DrawIndexedInstanced(
          _idxCount,
          _instCount,
          0,
          0,
          0
        );
      }
    }
  }
}

I didn’t especially enjoy having to get down and dirty with Direct3D to write this app – I’d much rather make use of a higher-level framework, where one’s available – but I’m reasonably happy with the results. In the next post, we’ll add some touch-based UI enhancements, although I suspect that given the low-level nature of the current implementation I probably won’t end up with the same level of touch UI as was so easy to write for Android or even iOS. We’ll see.

By the way: in a previous post I’d mentioned the monochrome UI in VS11. It turns out that Microsoft is responding to Beta feedback to “increase the energy” (sigh) of the UI. A good move, I think.
