September 19, 2014

Exploding nested AutoCAD blocks using .NET

Some time ago I posted about how to use Entity.Explode() to do something similar to AutoCAD’s EXPLODE command. At the time it was mentioned in the comments that BlockReference.ExplodeToOwnerSpace() had some relative benefits, but it’s taken me some time to code up a simple sample showing how you might use it (Patrick’s recent comment reminded me I ought to, though).

Anyway, to end the week I thought I’d throw together a quick sample. BlockReference.ExplodeToOwnerSpace() doesn’t return a list of the created objects, so I opted to capture them using a Database.ObjectAppended event handler and then recursively call our custom ExplodeBlock() function for any nested blocks that get created. We then also erase the originating entity (or entities, when called recursively), just as the EXPLODE command would.

Here’s the C# code:

using Autodesk.AutoCAD.ApplicationServices;

using Autodesk.AutoCAD.DatabaseServices;

using Autodesk.AutoCAD.EditorInput;

using Autodesk.AutoCAD.Runtime;

 

namespace Explosions

{

  public class Commands

  {

    [CommandMethod("EB")]

    public void ExplodeBlock()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

 

      // Ask the user to select the block

 

      var peo = new PromptEntityOptions("\nSelect block to explode");

      peo.SetRejectMessage("Must be a block.");

      peo.AddAllowedClass(typeof(BlockReference), false);

 

      var per = ed.GetEntity(peo);

 

      if (per.Status != PromptStatus.OK)

        return;

 

      using (var tr = db.TransactionManager.StartTransaction())

      {

        // Call our explode function recursively, starting

        // with the top-level block reference

        // (you can pass false as a 4th parameter if you

        // don't want originating entities erased)

 

        ExplodeBlock(tr, db, per.ObjectId);

 

        tr.Commit();

      }

    }

 

    private void ExplodeBlock(

      Transaction tr, Database db, ObjectId id, bool erase = true

    )

    {

      // Open our block reference - only needs to be readable

      // for the explode operation, as it's non-destructive

 

      var br = (BlockReference)tr.GetObject(id, OpenMode.ForRead);

 

      // We'll collect the BlockReferences created in a collection

 

      var toExplode = new ObjectIdCollection();

 

      // Define our handler to capture the nested block references

 

      ObjectEventHandler handler =

        (s, e) =>

        {

          if (e.DBObject is BlockReference)

          {

            toExplode.Add(e.DBObject.ObjectId);

          }

        };

 

      // Add our handler around the explode call, removing it

      // directly afterwards

 

      db.ObjectAppended += handler;

      br.ExplodeToOwnerSpace();

      db.ObjectAppended -= handler;

 

      // Go through the results and recurse, exploding the

      // contents

 

      foreach (ObjectId bid in toExplode)

      {

        ExplodeBlock(tr, db, bid, erase);

      }

 

      // We might also just let it drop out of scope

 

      toExplode.Clear();

 

      // To replicate the EXPLODE command, we delete the

      // original entity

 

      if (erase)

      {

        br.UpgradeOpen();

        br.Erase();

        br.DowngradeOpen();

      }

    }

  }

}

That’s it for this week. Monday is a holiday in Neuchatel, so I’ll be back online on Tuesday. And then on Wednesday I’m heading to TEDxCERN – which promises to be really cool. Can’t wait!

September 17, 2014

Reminder: Exchange Apps Hackathon this weekend

There’s still time to participate in the Autodesk Exchange Apps Hackathon, a virtual event taking place this weekend (September 20-21). The point of this event is to encourage developers to post apps to the Autodesk Exchange Apps store, and we’re even paying cold, hard cash ($50 or $100, depending on whether the app is free or paid) for each app that gets published.

Hackathon

Presentations and discussions will include:

  • How to architect your app for Exchange:
    • How to build your AutoCAD® app
    • How to build your Autodesk® Revit® app
    • How to build your Autodesk® Inventor® app
    • How to build your Autodesk® 3ds Max® and Autodesk® Maya® apps
  • Handling IPN notifications
  • Implementing simple copy protection for your app
  • Architecting your app to sell on monthly subscription
  • Selling online web services on Exchange Apps
  • Making use of Autodesk ‘cloud’ services such as our new, web-based, zero-client 3D model viewer
  • Sustainability apps

Lots of great information to help kick-start your app development. Sign up now!

September 15, 2014

Exporting Minecraft data from AutoCAD

After last week’s post on importing Minecraft data – in this case from Tinkercad – into AutoCAD, in today’s post we’re going to focus on the ultimately more interesting use case of generating Minecraft data from AutoCAD. We’re going to see some code to dice up 3D AutoCAD geometry and generate blocks in a .schematics file using Substrate.

Our “dicing” process – a term I’ve just coined for iterating through a 3D space, chunk by chunk – is going to use a couple of different approaches for determining whether there’s any 3D geometry in each grid location. Firstly, though, we’re going to generate a spatial index from the contents of the modelspace – a basic list of bounding boxes with the owning entity’s ObjectId (which could be optimised further by sorting based on location, as sketched below) – to decide whether we want to take a closer look at the geometry we find there.
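As an aside, here’s a rough sketch – my own, not part of the code further down – of what that location-based sort might look like. If the list is kept sorted by each entity’s minimum X value, the scan can bail out early once no remaining entry can possibly overlap the query point. PotentialClashesSorted() is a hypothetical variant of the PotentialClashes() method you’ll see below:

    // A possible refinement of GeometryIndex.PotentialClashes(): it assumes
    // _extList has been sorted once, after PopulateIndex(), with
    // _extList.Sort((a, b) => a.Item1.MinPoint.X.CompareTo(b.Item1.MinPoint.X));

    internal ObjectIdCollection PotentialClashesSorted(
      Point3d pt, double step
    )
    {
      var res = new ObjectIdCollection();

      foreach (var item in _extList)
      {
        var ext = item.Item1;

        // Entries are sorted by MinPoint.X, so once this test fails it
        // fails for every remaining entry and we can stop scanning

        if (ext.MinPoint.X > pt.X + step)
          break;

        if (
          pt.X <= ext.MaxPoint.X + step &&
          pt.Y + step >= ext.MinPoint.Y &&
          pt.Y <= ext.MaxPoint.Y + step &&
          pt.Z + step >= ext.MinPoint.Z &&
          pt.Z <= ext.MaxPoint.Z + step
        )
        {
          res.Add(item.Item2);
        }
      }

      return res;
    }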

If we get a “hit” from the spatial index, we can test the associated entity for whether the specific point we’re interested in actually contains geometry. The specific test will vary based on the type of 3D object we find…

If it’s a Solid3d we can perform a simple test using the CheckInterference() method, passing in a cubic Solid3d occupying the location to test. This works fine, but will generate hits for internal cubes, too (i.e. we end up with a fully solid object, rather than just having blocks representing the shell). Ideally we would union the two solids and check the resultant volume to see whether it’s an internal cube or not (if the volume doesn’t change then it’s internal), but that’s likely to be expensive. The ObjectARX equivalent, AcDb3dSolid::checkInterference(), does allow this, but it’s more complicated from .NET. Right now we simply create blocks for internal locations, as well, which may also be what the user wants in many cases.
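For what it’s worth, here’s a minimal sketch of that union-and-compare-volumes idea – my own hypothetical helper, not something used by the EMC command below – assuming Solid3d.MassProperties and BooleanOperation() behave as I expect and that the cube has already been displaced to the location being tested:

    // Returns true if the test cube lies entirely inside the solid: unite
    // clones of the two solids and see whether the volume changes
    // (expensive, which is why the command below doesn't do this)

    private static bool IsInternalCube(Solid3d solid, Solid3d cube)
    {
      using (var solClone = (Solid3d)solid.Clone())
      using (var cubeClone = (Solid3d)cube.Clone())
      {
        var before = solClone.MassProperties.Volume;

        // Unite the cube into the solid clone...

        solClone.BooleanOperation(
          BooleanOperationType.BoolUnite, cubeClone
        );

        // ... if the volume is (near enough) unchanged, the cube must be
        // fully contained within the solid

        var after = solClone.MassProperties.Volume;
        return System.Math.Abs(after - before) < 1e-8 * before;
      }
    }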

For Surface objects there’s a bit more to do: here we use ProjectOnToSurface(), passing in a DBPoint, to see whether there’s a point in the block that’s close to the surface. We do this for each of the location cube’s vertices, which may be overkill but seems to give the best results for complex surfaces. Needless to say, we stop checking for “clashes” the first time we get a hit in a particular location – there’s no need to keep looking (although we might want to if we wanted to get the best possible material for a block… for now we’re not worrying about materials at all).

To put the code through its paces, I went ahead and rebuilt a space shuttle model using the code in this previous post (although I performed the loft operations by hand – for some reason these don’t work for me anymore).

I then went and used the EMC command – changing the default block size to 0.1, to make the model more detailed – and then used IMC to reimport it into AutoCAD:

Shuttle export and reimport

You should bear in mind that AutoCAD’s representation of an imported model is a lot heavier than it would be in Minecraft – each block is a block reference or a 3D solid, depending – so complex models that have trouble being loaded back into AutoCAD will often import just fine into Minecraft.

Here’s the space shuttle schematics file imported into a new world in MCEdit. I chose “diamond” as the material (just to add a bit of a bling factor for my kids), but you could easily hardcode another choice of material or select it based on the geometry’s layer (etc.).

Diamond shuttle in MCEdit

The surface analysis algorithm could use some tweaking – you can see some holes on the sides where the surface is close to vertical, and the base is very thick – but it’s good enough for my purposes.

Here’s the C# code defining the EMC and IMC commands:

using Autodesk.AutoCAD.ApplicationServices;

using Autodesk.AutoCAD.DatabaseServices;

using Autodesk.AutoCAD.EditorInput;

using Autodesk.AutoCAD.Geometry;

using Autodesk.AutoCAD.Runtime;

using AcDb = Autodesk.AutoCAD.DatabaseServices;

using Substrate;

using System;

using System.Collections.Generic;

 

namespace Minecraft

{

  public class GeometryIndex

  {

    // We need a list of extents vs. ObjectIds

    // (some kind of spatial sorting might help with

    // performance, but for now it's just a flat list)

 

    List<Tuple<Extents3d, ObjectId>> _extList;

 

    public GeometryIndex()

    {

      _extList = new List<Tuple<Extents3d, ObjectId>>();

    }

 

    public int Size

    {

      get { return _extList.Count; }

    }

 

    public void PopulateIndex(BlockTableRecord ms, Transaction tr)

    {

      foreach (var id in ms)

      {

        var ent =

          tr.GetObject(id, OpenMode.ForRead) as

            Autodesk.AutoCAD.DatabaseServices.Entity;

        if (ent != null)

        {

          _extList.Add(

            new Tuple<Extents3d, ObjectId>(

              ent.GeometricExtents, id

            )

          );

        }

      }

    }

 

    internal ObjectIdCollection PotentialClashes(

      Point3d pt, double step

    )

    {

      var res = new ObjectIdCollection();

 

      foreach (var item in _extList)

      {

        var ext = item.Item1;

 

        if (

          pt.X + step >= ext.MinPoint.X &&

          pt.X <= ext.MaxPoint.X + step &&

          pt.Y + step >= ext.MinPoint.Y &&

          pt.Y <= ext.MaxPoint.Y + step &&

          pt.Z + step >= ext.MinPoint.Z &&

          pt.Z <= ext.MaxPoint.Z + step

        )

        {

          res.Add(item.Item2);

        }

      }

 

      return res;

    }

  }

 

  public class Commands

  {

    // Members that will be set by the EMC command and

    // picked up by the IMC command

 

    private double _blockSize = 1.0;

    private Point3d _origin = Point3d.Origin;

 

    [CommandMethod("IMC")]

    public void ImportMinecraft()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

 

      // Request the name of the file to import

 

      var opts =

        new PromptOpenFileOptions(

          "Import from Minecraft"

        );

      opts.Filter =

        "Minecraft schematic (*.schematic)|*.schematic|" +

        "All files (*.*)|*.*";

      var pr = ed.GetFileNameForOpen(opts);

 

      if (pr.Status != PromptStatus.OK)

        return;

 

      // Read in the selected  Schematic file

 

      var schem =

        Substrate.ImportExport.Schematic.Import(pr.StringResult);

 

      if (schem == null)

      {

        ed.WriteMessage("\nCould not find Minecraft schematic.");

        return;

      }

 

      // Let the user choose the location of the geometry

 

      ed.WriteMessage(

        "\nDefault insert is {0}", _origin

      );

      var ppo = new PromptPointOptions("\nInsertion point or ");

      ppo.Keywords.Add("Default");

      ppo.AllowNone = true;

 

      var ppr = ed.GetPoint(ppo);

 

      Vector3d offset;

 

      if (ppr.Status == PromptStatus.Keyword)

      {

        offset = _origin.GetAsVector();

      }

      else if (ppr.Status == PromptStatus.OK)

      {

        offset = ppr.Value.GetAsVector();

      }

      else

      {

        return;

      }

 

      // Let the user choose the size of the block

 

      var pdo = new PromptDoubleOptions("\nEnter block size");

      pdo.AllowNegative = false;

      pdo.AllowNone = true;

      pdo.DefaultValue = _blockSize;

      pdo.UseDefaultValue = true;

 

      var pdr = ed.GetDouble(pdo);

 

      if (pdr.Status != PromptStatus.OK)

        return;

 

      _blockSize = pdr.Value;

      var step = _blockSize;

 

      // We only really care about the blocks

 

      var blks = schem.Blocks;

 

      // We can either create Solid3d objects for each Minecraft

      // block, or we can create a BlockTableRecord containing

      // a single Solid3d that we reference for each block

      // (if useBlock is set to true)

 

      var blkId = ObjectId.Null;

      var useBlock = true;

 

      using (var tr = db.TransactionManager.StartTransaction())

      {

        var bt =

          (BlockTable)tr.GetObject(

            db.BlockTableId, OpenMode.ForRead

          );

 

        if (useBlock)

        {

          bt.UpgradeOpen();

 

          // Create our block and add it to the db & transaction

 

          var btr = new BlockTableRecord();

          btr.Name = "Minecraft Block";

 

          blkId = bt.Add(btr);

          tr.AddNewlyCreatedDBObject(btr, true);

 

          // Create our cube and add it to the block & transaction

 

          var cube = new Solid3d();

          cube.CreateBox(step, step, step);

 

          btr.AppendEntity(cube);

          tr.AddNewlyCreatedDBObject(cube, true);

 

          bt.DowngradeOpen();

        }

 

        var ms =

          tr.GetObject(

            bt[BlockTableRecord.ModelSpace], OpenMode.ForWrite

          ) as BlockTableRecord;

        if (ms != null)

        {

          using (var pm = new ProgressMeter())

          {

            pm.Start("Importing Minecraft schematic");

            pm.SetLimit(blks.XDim * blks.YDim * blks.ZDim);

 

            // Create a cubic solid for each block

 

            for (int x = 0; x < blks.XDim; ++x)

            {

              for (int y = 0; y < blks.YDim; ++y)

              {

                for (int z = 0; z < blks.ZDim; ++z)

                {

                  var blk = blks.GetBlock(x, y, z);

                  if (blk != null && blk.Info.Name != "Air")

                  {

                    // Minecraft has a right-handed coordinate

                    // system with Z & Y swapped and Z negated

 

                    var disp =

                      new Point3d(x * step, -z * step, y * step) +

                      offset;

 

                    AcDb.Entity ent;

 

                    if (useBlock)

                    {

                      ent = new BlockReference(disp, blkId);

                    }

                    else

                    {

                      var sol = new Solid3d();

                      sol.CreateBox(step, step, step);

                      sol.TransformBy(

                        Matrix3d.Displacement(disp.GetAsVector())

                      );

                      ent = sol;

                    }

 

                    // Assign the layer based on the material

 

                    ent.LayerId =

                      LayerForMaterial(tr, db, blk.Info.Name);

 

                    ms.AppendEntity(ent);

                    tr.AddNewlyCreatedDBObject(ent, true);

                  }

                  pm.MeterProgress();

                  System.Windows.Forms.Application.DoEvents();

                }

              }

            }

            pm.Stop();

            System.Windows.Forms.Application.DoEvents();

          }

          tr.Commit();

        }

      }

 

      // Zoom to the model's extents

 

      ed.Command("_.ZOOM", "_EXTENTS");

    }

 

    private ObjectId LayerForMaterial(

      Transaction tr, Database db, string layname

    )

    {

      // If a layer with the material's name exists, return its id

 

      var lt =

        (LayerTable)tr.GetObject(db.LayerTableId, OpenMode.ForRead);

      if (lt.Has(layname))

      {

        return lt[layname];

      }

 

      // Otherwise create a new layer for this material

 

      var ltr = new LayerTableRecord();

      ltr.Name = layname;

 

      lt.UpgradeOpen();

      var ltrId = lt.Add(ltr);

      lt.DowngradeOpen();

 

      tr.AddNewlyCreatedDBObject(ltr, true);

 

      return ltrId;

    }

 

    [CommandMethod("EMC")]

    public void ExportMinecraft()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

      var msId = SymbolUtilityServices.GetBlockModelSpaceId(db);

 

      // Request the name of the file to export to

 

      var opts =

        new PromptSaveFileOptions(

          "Export to Minecraft"

        );

      opts.Filter =

        "Minecraft schematic (*.schematic)|*.schematic|" +

        "All files (*.*)|*.*";

      var pr = ed.GetFileNameForSave(opts);

 

      if (pr.Status != PromptStatus.OK)

        return;

 

      var idx = new GeometryIndex();

 

      var emin = db.Extmin;

      var emax = db.Extmax;

 

      // Let the user choose the size of the block - offer a

      // default of a 50th of the diagonal length of the 3D extents

 

      var defSize =

        emax.GetAsVector().Subtract(emin.GetAsVector()).Length / 50;

 

      var pdo = new PromptDoubleOptions("\nEnter block size");

      pdo.AllowNegative = false;

      pdo.AllowNone = true;

      pdo.DefaultValue = defSize;

      pdo.UseDefaultValue = true;

 

      var pdr = ed.GetDouble(pdo);

 

      if (pdr.Status != PromptStatus.OK)

        return;

 

      _blockSize = pdr.Value;

      var step = _blockSize;

 

      _origin = new Point3d(emin.X, emax.Y, emin.Z);

 

      ed.WriteMessage(

        "\nExporting with block size of {0} at {1}.",

        step, _origin

      );

 

      // Set up our empty schematic container

 

      var schem =

        new Substrate.ImportExport.Schematic(

          (int)Math.Ceiling((emax.X - emin.X) / step),

          (int)Math.Ceiling((emax.Z - emin.Z) / step),

          (int)Math.Ceiling((emax.Y - emin.Y) / step)

        );

 

      using (var tr = db.TransactionManager.StartTransaction())

      {

        // Get our modelspace

 

        var ms =

          tr.GetObject(msId, OpenMode.ForRead) as BlockTableRecord;

        if (ms != null)

        {

          // Start by populating the spatial index based on the

          // contents of the modelspace

 

          idx.PopulateIndex(ms, tr);

 

          // We'll just use two materials - air and the model

          // material (which for now is "diamond")

 

          var air = new AlphaBlock(BlockType.AIR);

          var diamond = new AlphaBlock(BlockType.DIAMOND_BLOCK);

 

          using (var cube = new Solid3d())

          {

            // We'll use a single cube to test interference

 

            cube.CreateBox(step, step, step);

 

            var std2 = step / 2.0;

            var vecs =

              new Vector3d[]

              {

                new Vector3d(std2,std2,std2),

                new Vector3d(std2,std2,-std2),

                new Vector3d(std2,-std2,std2),

                new Vector3d(std2,-std2,-std2),

                new Vector3d(-std2,std2,std2),

                new Vector3d(-std2,std2,-std2),

                new Vector3d(-std2,-std2,std2),

                new Vector3d(-std2,-std2,-std2)

              };

            var blks = schem.Blocks;

 

            using (var pm = new ProgressMeter())

            {

              pm.Start("Exporting Minecraft schematic");

              pm.SetLimit(blks.XDim * blks.YDim * blks.ZDim);

 

              for (int x = 0; x < blks.XDim; ++x)

              {

                for (int y = 0; y < blks.YDim; ++y)

                {

                  for (int z = 0; z < blks.ZDim; ++z)

                  {

                    // Get the WCS point to test modelspace contents

 

                    var wcsX = emin.X + step * x;

                    var wcsY = emax.Y + step * -z;

                    var wcsZ = emin.Z + step * y;

 

                    var pt = new Point3d(wcsX, wcsY, wcsZ);

 

                    // Check our point against bounding boxes

                    // to detect potential clashes

 

                    var ents = idx.PotentialClashes(pt, step);

 

                    // If we have some, verify using a more precise,

                    // per-entity interference check

 

                    if (ents.Count > 0)

                    {

                      var disp = pt.GetAsVector();

 

                      // Displace our interference cube to the

                      // location we want to test

 

                      cube.TransformBy(Matrix3d.Displacement(disp));

 

                      bool found = false;

 

                      // Check each of the potentially clashing

                      // entities against our test cube

 

                      foreach (ObjectId id in ents)

                      {

                        // For Solid3ds we simply check interference

                        // with our cube

 

                        var obj = tr.GetObject(id, OpenMode.ForRead);

                        var sol = obj as Solid3d;

                        if (sol != null)

                        {

                          if (sol.CheckInterference(cube))

                          {

                            // When we've found one we don't need to

                            // test the others

 

                            found = true;

                            break;

                          }

                        }

                        else

                        {

                          // For Surfaces we don't use the cube:

                          // we create a point at the location

                          // and project it onto the surface. If

                          // the resulting point is less than two

                          // steps away, we create the

                          // block at this location

 

                          var sur = obj as AcDb.Surface;

                          if (sur != null)

                          {

                            foreach (var v in vecs)

                            {

                              found =

                                SurfaceClash(sur, pt + v, step);

                              if (found)

                                break;

                            }

                          }

                        }

                      }

 

                      // Whether we've found a clash will drive

                      // whether we set the block to be stone or air

 

                      blks.SetBlock(x, y, z, found ? diamond : air);

 

                      // Displace the cube back again, ready for the

                      // next test

 

                      cube.TransformBy(Matrix3d.Displacement(-disp));

                    }

                    pm.MeterProgress();

                    System.Windows.Forms.Application.DoEvents();

                  }

                }

              }

              pm.Stop();

              System.Windows.Forms.Application.DoEvents();

            }

          }

 

          // Finally we write the block information to a schematics

          // file

 

          schem.Export(pr.StringResult);

        }

        tr.Commit();

      }

    }

 

    private static bool SurfaceClash(

      AcDb.Surface sur, Point3d pt, double step

    )

    {

      try

      {

        var found = false;

 

        using (var dbp = new DBPoint(pt))

        {

          var ps =

            sur.ProjectOnToSurface(

              dbp, Vector3d.ZAxis

            );

 

          if (ps.Length > 0)

          {

            foreach (var p in ps)

            {

              var dbp2 = p as DBPoint;

              if (!found && dbp2 != null)

              {

                // If a discovered point is within two blocks' width

                // we consider it a hit

 

                var dist = dbp2.Position - dbp.Position;

                found = (dist.Length < 2.0 * step);

              }

              p.Dispose();

            }

          }

 

          if (found)

          {

            return true;

          }

        }

      }

      catch { }

 

      return false;

    }

  }

}

That’s all I currently have planned in terms of Minecraft-related posts, but if anyone has additional suggestions on where to take this, please do share them.

Unrelatedly, Minecraft is in the news a lot at the moment with the prospective acquisition of Mojang by Microsoft. It seems a lot of fans are concerned by this prospect, but one way or another it’ll be interesting to see how it all plays out…

September 11, 2014

Importing Minecraft data into AutoCAD

A mere 2 among 100 million registered users, my boys are crazy about Minecraft. I’ve been looking into how I might be able to help them use Autodesk tools (well, AutoCAD) to generate Minecraft content. In this post we’ll take a look at importing Minecraft data into AutoCAD, but ultimately the creation/export story is clearly more interesting (something we’ll look at in the next post, I expect).

To investigate dealing with Minecraft data – bearing in mind I didn’t actually know anything much about its file formats – I took a look at the Minecraft export you can perform from Tinkercad, which has been part of that product for just over a year. I took one of my algorithmically-created Tinkercad designs and clicked on “Download for Minecraft”:

Algorithmic objects in Tinkercad

This created a local “more_knots.schematic” file, which presumably has information that Minecraft can make sense of. To check this out, I went and installed MCEdit and imported the schematic file into a new world. It was quite fun to see the Tinkercad geometry appear in a Minecraft-like environment:

Tinkercad geometry in MCEdit

Next step, then, was to work out how to get access to the “.schematics” format from .NET. A quick web-search led me to Substrate. I cloned it from GitHub and built it into an AutoCAD plug-in that uses the ImportExport capability to bring in a Schematic file.

It was then a reasonably simple matter to access the blocks and create cubic solids at the right locations to represent them. The only tricky piece, here, is that Minecraft uses a right-handed coordinate system with Z and Y swapped – from our perspective, anyway – and then the Y-axis negated… so it’s X, –Z, Y, I suppose. Because the Y axis is negated – and the geometry will be relative to an origin that isn’t specified in the .schematics file – the position of the model may well need to be moved if you want to check its overlap with source geometry. That’s why the user can select the position and the block size in the import command (which we will set in memory from our export command, making it really easy for the user to export and then reimport to check the quality).
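Distilled into a tiny helper of my own – the full command below inlines this logic rather than calling a separate function – the mapping from a Minecraft block index to an AutoCAD WCS point looks like this:

    // Hypothetical helper: map a Minecraft block index (x, y, z) to an
    // AutoCAD WCS point - scale by the block size, swap Y and Z, negate
    // the new Y and shift by the chosen insertion point

    private static Point3d MinecraftToWcs(
      int x, int y, int z, double step, Vector3d offset
    )
    {
      return new Point3d(x * step, -z * step, y * step) + offset;
    }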

Rather than just creating hundreds or thousands of cubic Solid3d objects, I’ve coded the (default) option to create a single Solid3d in a BlockTableRecord and then create a BlockReference for each Minecraft block. This has advantages both from a file size and memory consumption perspective (AutoCAD’s 3D graphics system is optimised for instanced geometry such as block references).

The code adds blocks to layers based on the names of their materials (I’ve also neglected adding “Air” blocks to the drawing, for obvious reasons). It’s then up to the user to assign appropriate colours to the various layers, as they see fit.
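If you’d rather have the plug-in pick colours for you, here’s one way you might extend LayerForMaterial() – a hedged sketch of my own, not part of the code below – deriving an ACI colour from the material name. It assumes an additional “using Autodesk.AutoCAD.Colors;” and would be called on the new LayerTableRecord before adding it to the layer table:

    // Hypothetical addition to LayerForMaterial(): give each new material
    // layer a colour derived from its name

    private static void ColourLayerForMaterial(LayerTableRecord ltr)
    {
      // Derive a stable - if arbitrary - colour index in the 1-254 range
      // from the layer/material name, so repeated imports stay consistent

      var aci =
        (short)(System.Math.Abs(ltr.Name.GetHashCode()) % 254 + 1);
      ltr.Color = Color.FromColorIndex(ColorMethod.ByAci, aci);
    }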

Here’s the Tinkercad data brought into AutoCAD (with my own layer colouring) using the IMC command:

Tinkercad geometry in AutoCAD

Here’s the C# code that implements the IMC command, performing a simple import of Minecraft data:

using Autodesk.AutoCAD.ApplicationServices;

using Autodesk.AutoCAD.DatabaseServices;

using Autodesk.AutoCAD.EditorInput;

using Autodesk.AutoCAD.Geometry;

using Autodesk.AutoCAD.Runtime;

using AcDb = Autodesk.AutoCAD.DatabaseServices;

 

namespace Minecraft

{

  public class Commands

  {

    // Members that will be set by the EMC command and

    // picked up by the IMC command

 

    private double _blockSize = 1.0;

    private Point3d _origin = Point3d.Origin;

 

    [CommandMethod("IMC")]

    public void ImportMinecraft()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

 

      // Request the name of the file to import

 

      var opts =

        new PromptOpenFileOptions(

          "Import from Minecraft"

        );

      opts.Filter =

        "Minecraft schematic (*.schematic)|*.schematic|" +

        "All files (*.*)|*.*";

      var pr = ed.GetFileNameForOpen(opts);

 

      if (pr.Status != PromptStatus.OK)

        return;

 

      // Read in the selected  Schematic file

 

      var schem =

        Substrate.ImportExport.Schematic.Import(pr.StringResult);

 

      if (schem == null)

      {

        ed.WriteMessage("\nCould not find Minecraft schematic.");

        return;

      }

 

      // Let the user choose the location of the geometry

 

      ed.WriteMessage(

        "\nDefault insert is {0}", _origin

      );

      var ppo = new PromptPointOptions("\nInsertion point or ");

      ppo.Keywords.Add("Default");

      ppo.AllowNone = true;

 

      var ppr = ed.GetPoint(ppo);

 

      Vector3d offset;

 

      if (ppr.Status == PromptStatus.Keyword)

      {

        offset = _origin.GetAsVector();

      }

      else if (ppr.Status == PromptStatus.OK)

      {

        offset = ppr.Value.GetAsVector();

      }

      else

      {

        return;

      }

 

      // Let the user choose the size of the block

 

      var pdo = new PromptDoubleOptions("\nEnter block size");

      pdo.AllowNegative = false;

      pdo.AllowNone = true;

      pdo.DefaultValue = _blockSize;

      pdo.UseDefaultValue = true;

 

      var pdr = ed.GetDouble(pdo);

 

      if (pdr.Status != PromptStatus.OK)

        return;

 

      _blockSize = pdr.Value;

      var step = _blockSize;

 

      // We only really care about the blocks

 

      var blks = schem.Blocks;

 

      // We can either create Solid3d objects for each Minecraft

      // block, or we can create a BlockTableRecord containing

      // a single Solid3d that we reference for each block

      // (if useBlock is set to true)

 

      var blkId = ObjectId.Null;

      var useBlock = true;

 

      using (var tr = db.TransactionManager.StartTransaction())

      {

        var bt =

          (BlockTable)tr.GetObject(

            db.BlockTableId, OpenMode.ForRead

          );

 

        if (useBlock)

        {

          bt.UpgradeOpen();

 

          // Create our block and add it to the db & transaction

 

          var btr = new BlockTableRecord();

          btr.Name = "Minecraft Block";

 

          blkId = bt.Add(btr);

          tr.AddNewlyCreatedDBObject(btr, true);

 

          // Create our cube and add it to the block & transaction

 

          var cube = new Solid3d();

          cube.CreateBox(step, step, step);

 

          btr.AppendEntity(cube);

          tr.AddNewlyCreatedDBObject(cube, true);

 

          bt.DowngradeOpen();

        }

 

        var ms =

          tr.GetObject(

            bt[BlockTableRecord.ModelSpace], OpenMode.ForWrite

          ) as BlockTableRecord;

        if (ms != null)

        {

          using (var pm = new ProgressMeter())

          {

            pm.Start("Importing Minecraft schematic");

            pm.SetLimit(blks.XDim * blks.YDim * blks.ZDim);

 

            // Create a cubic solid for each block

 

            for (int x = 0; x < blks.XDim; ++x)

            {

              for (int y = 0; y < blks.YDim; ++y)

              {

                for (int z = 0; z < blks.ZDim; ++z)

                {

                  var blk = blks.GetBlock(x, y, z);

                  if (blk != null && blk.Info.Name != "Air")

                  {

                    // Minecraft has a right-handed coordinate

                    // system with Z & Y swapped and Z negated

 

                    var disp =

                      new Point3d(x * step, -z * step, y * step) +

                      offset;

 

                    AcDb.Entity ent;

 

                    if (useBlock)

                    {

                      ent = new BlockReference(disp, blkId);

                    }

                    else

                    {

                      var sol = new Solid3d();

                      sol.CreateBox(step, step, step);

                      sol.TransformBy(

                        Matrix3d.Displacement(disp.GetAsVector())

                      );

                      ent = sol;

                    }

 

                    // Assign the layer based on the material

 

                    ent.LayerId =

                      LayerForMaterial(tr, db, blk.Info.Name);

 

                    ms.AppendEntity(ent);

                    tr.AddNewlyCreatedDBObject(ent, true);

                  }

                  pm.MeterProgress();

                  System.Windows.Forms.Application.DoEvents();

                }

              }

            }

            pm.Stop();

            System.Windows.Forms.Application.DoEvents();

          }

          tr.Commit();

        }

      }

 

      // Zoom to the model's extents

 

      ed.Command("_.ZOOM", "_EXTENTS");

    }

 

    private ObjectId LayerForMaterial(

      Transaction tr, Database db, string layname

    )

    {

      // If a layer with the material's name exists, return its id

 

      var lt =

        (LayerTable)tr.GetObject(db.LayerTableId, OpenMode.ForRead);

      if (lt.Has(layname))

      {

        return lt[layname];

      }

 

      // Otherwise create a new layer for this material

 

      var ltr = new LayerTableRecord();

      ltr.Name = layname;

 

      lt.UpgradeOpen();

      var ltrId = lt.Add(ltr);

      lt.DowngradeOpen();

 

      tr.AddNewlyCreatedDBObject(ltr, true);

 

      return ltrId;

    }

  }

}

So far so good! It’s not the best way to bring data from Tinkercad into AutoCAD, but then that’s not the point, of course. This is just about getting access to Minecraft data before we look at the more interesting use case of dicing the current AutoCAD model and generating a .schematic output file.

September 09, 2014

SINDEX 2014 – Switzerland’s industrial automation trade show

Last week I spent a morning at SINDEX 2014, which is apparently Switzerland’s largest technology exhibition. Its main focus is on industrial electronics and automation: not exactly fields I know a lot about, but ones that I do find to be extremely interesting. There were 400 or so exhibitors, many focused on providing sensors and electrical equipment, others providing complete automation solutions.

SINDEX 2014

There was a serious focus on robotics, for instance, something in which regular readers will know I have a strong interest. This post contains a few examples of the robots that were on show.

This robot, from Stäubli, is moving boxes of Tic Tacs around at high speed. Yes, the video is in no way sped up – it really moved that quickly.




Increasing the ability for robots to function more autonomously typically involves computer vision and image processing techniques, a very cool – and increasingly important – part of the technology landscape. Some of the most interesting exhibits, for me, were the ones using integrated depth cameras (e.g. Kinect), laser scanners or just basic image processing to enhance a robot’s ability to deal with varying situations.

Take this system from Isra Vision




… where the robot is picking parts out of a bin based on input from a laser scanner:

The ceiling-mounted laser scanner

The system scans the top level of the bin and matches portions of the point cloud against a known 3D model of the part. That way the robot can effectively position itself to pick up the “best” part from the bin and then go ahead and do something with it.

The Isra system dashboard

If you look at the above video – at around the 45 second mark – you can see a physical debugging console showing the code the robot’s controller is stepping through. Apparently most manufacturers have their own (often quite arcane) languages for robotic control (again – not a field I have experience with… I’d be very happy to get comments from people who do). It was certainly very interesting to see the code in action.

Of course there were a number of fun technology demonstrations that have no serious, practical purpose – and yet demonstrate the technology’s potential – something I admit I like a great deal (check my next post for one more example of this ;-).

For instance, here’s the Xylobot – created at HEIG-VD – playing the Dance of the Sugar Plum Fairy:




It was programmed to play a number of other tunes, too. Fun stuff!

Update:

Martin Müller made a very valid comment that the Xylobot is in some ways quite ugly: not necessarily in terms of its physical appearance but in terms of its perfect precision. I expect it’d be possible to program in a little imperfection, of course – if the goal was to make the performance more “human” – but that clearly wasn’t the point of this particular technology demonstration.

This reminded me of another video I took of some more “sensitive” robots, though, this time made by KUKA. These are not only elegant from a design perspective but from the “care” with which they can perform intricate tasks. Perhaps not beautiful in the way they replicate human imperfection, but nonetheless impressive…


September 08, 2014

Translating between AutoCAD drawing points and geographic locations using .NET – Part 3

I hadn’t planned on writing a third part to this series, but then Mads Paulin – who I mentioned in the last post – got back to me with some information that’s really worth sharing.

I’d pinged Mads while looking for a way to list the available coordinate systems to the user (another question from Coralie Jacobi, who had originally kicked off the series). I’d found out that the coordinate system definitions are primarily stored in this folder…

C:\ProgramData\Autodesk\Geospatial Coordinate Systems 14.01

… but I hadn’t worked out how to extract any information from the contained CSD files.

Mads delivered the goods: he explained there’s a GeoCoordinateSystem class that contains a static method CreateAll(). The returned array of GeoCoordinateSystem objects can then be iterated, accessing the information you want from each one.

There’s also a static Create() method which lets you instantiate a single GeoCoordinateSystem class based on the XML you retrieve from the GeoLocation object. So we can now rip out our previous DynamicXML implementation (I’m glad we went through it, though: it’s still an interesting technique for people needing to retrieve data from XML). Mads will be happy: as the architect for the feature he was temporarily embarrassed by the hoops I ended up jumping through to get at this simple bit of information. :-)

Here’s the updated C# code. Look in particular for the new LCS (ListCoordinateSystems) command and the greatly simplified PrintCoordinateSystem() helper function. (The LLFP and PFLL commands should work just as they did last time around, of course.)

using Autodesk.AutoCAD.ApplicationServices;

using Autodesk.AutoCAD.DatabaseServices;

using Autodesk.AutoCAD.EditorInput;

using Autodesk.AutoCAD.Geometry;

using Autodesk.AutoCAD.Runtime;

using System;

 

namespace GeoLocationAPI

{

  public class Commands

  {

    [CommandMethod("IGR")]

    public void InsertGeoRef()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

      var msId = SymbolUtilityServices.GetBlockModelSpaceId(db);

 

      if (HasGeoData(db))

      {

        // Report and return: could also open the object for

        // write and modify its properties, of course

 

        ed.WriteMessage("\nDrawing already has geo-location data!");

        return;

      }

 

      // Let's create some geolocation data for this drawing,

      // using a handy method to add it to the modelspace

      // (it gets added to the extension dictionary)

 

      var data = new GeoLocationData();

      data.BlockTableRecordId = msId;

      data.PostToDb();

 

      // We're going to define our geolocation in terms of

      // latitude/longitude using the Mercator projection

      // http://en.wikipedia.org/wiki/Mercator_projection

 

      data.CoordinateSystem = "WORLD-MERCATOR";

      data.TypeOfCoordinates = TypeOfCoordinates.CoordinateTypeGrid;

 

      // Use the lat-long for La Tene, my local "beach"

      // (it's on a lake, after all :-)     

 

      var geoPt = new Point3d(7.019438, 47.005247, 0);

 

      // Transform from a geographic to a modelspace point

      // and add the information to our geolocation data

 

      var wcsPt = data.TransformFromLonLatAlt(geoPt);

      data.DesignPoint = wcsPt;

      data.ReferencePoint = geoPt;

 

      // Let's launch the GEOMAP command to show our geographic

      // overlay

 

      ed.Command("_.GEOMAP", "_AERIAL");

 

      // Now we'll add a circle around our location

      // and that will provide the extents for our zoom

 

      using (var tr = db.TransactionManager.StartTransaction())

      {

        var ms =

          tr.GetObject(msId, OpenMode.ForWrite) as BlockTableRecord;

        if (ms != null)

        {

          // Add a red circle of 7K units radius

          // centred on our point

 

          var circle = new Circle(wcsPt, Vector3d.ZAxis, 7000);

          circle.ColorIndex = 1;

          ms.AppendEntity(circle);

          tr.AddNewlyCreatedDBObject(circle, true);

        }

        tr.Commit();

      }

 

      // And we'll zoom to the circle's extents

 

      ed.Command("_.ZOOM", "_OBJECT", "_L", "");

    }

 

    [CommandMethod("CGI")]

    public void CreateGeoMapImage()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

 

      // Get the first corner of our area to convert to a

      // GeomapImage

 

      var ppo = new PromptPointOptions("\nSpecify first corner");

      var ppr = ed.GetPoint(ppo);

      if (ppr.Status != PromptStatus.OK)

        return;

 

      var first = ppr.Value;

 

      // And get the second point as a corner (to rubber-band

      // the selection)

 

      var pco =

        new PromptCornerOptions("\nSpecify second corner", first);

      ppr = ed.GetCorner(pco);

 

      if (ppr.Status != PromptStatus.OK)

        return;

 

      var second = ppr.Value;

 

      // We'll use an event handler on the Database to check for

      // GeomapImage entities being added

      // (we'll use a lambda but assigned to a variable to be

      // able to remove it, afterwards)

 

      ObjectId giId = ObjectId.Null;

      ObjectEventHandler handler =

        (s, e) =>

        {

          if (e.DBObject is GeomapImage)

          {

            giId = e.DBObject.ObjectId;

          }

        };

 

      // Simply call the GEOMAPIMAGE command with the two points

 

      db.ObjectAppended += handler;

      ed.Command("_.GEOMAPIMAGE", first, second);

      db.ObjectAppended -= handler;

 

      // Only continue if we've collected a valid ObjectId

 

      if (giId == ObjectId.Null)

        return;

 

      // Open the entity and change some values

 

      try

      {

        using (var tr = doc.TransactionManager.StartTransaction())

        {

          // Get each object and check if it's a GeomapImage

 

          var gi =

            tr.GetObject(giId, OpenMode.ForWrite) as GeomapImage;

          if (gi != null)

          {

            // Let's adjust the brightness/contrast/fade of the

            // GeomapImage

 

            gi.Brightness = 90;

            gi.Contrast = 40;

            gi.Fade = 20;

 

            // And make sure it's at the right resolution and

            // shows both aerial and road information

 

            gi.Resolution = GeomapResolution.Optimal;

            gi.MapType = GeomapType.Hybrid;

 

            gi.UpdateMapImage(true);

          }

 

          tr.Commit();

        }

      }

      catch (Autodesk.AutoCAD.Runtime.Exception)

      {

        ed.WriteMessage(

          "\nUnable to update geomap image entity." +

          "\nPlease check your internet connectivity and call " +

          "GEOMAPIMAGEUPDATE."

        );

      }

    }

 

    [CommandMethod("LCS")]

    public void ListCoordinateSystems()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

 

      // Get all the available coordinate systems

      var css = GeoCoordinateSystem.CreateAll();

 

      ed.WriteMessage("\nAvailable coordinate systems:\n");

 

      int i=1;

      foreach (var cs in css)

      {

        ed.WriteMessage("\n{0} {1}", i, cs.ID);

        ++i;

      }

    }

 

    [CommandMethod("LLFP")]

    public void LatLongFromPoint()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

 

      if (!HasGeoData(db))

      {

        ed.WriteMessage(

          "\nCurrent drawing has no geo-location information."

        );

        return;

      }

 

      // Get the drawing point to be translated into a lat-lon

 

      var ppo = new PromptPointOptions("\nSpecify point");

      var ppr = ed.GetPoint(ppo);

      if (ppr.Status != PromptStatus.OK)

        return;

 

      var dwgPt = ppr.Value;

 

      // Translate the drawing point to a lat-lon

 

      var res = TranslateGeoPoint(db, dwgPt, true);

 

      // Print any coordinate system information

 

      PrintCoordinateSystem(ed, res.Item2);

 

      // And then the point itself

 

      var lonlat = res.Item1;

      ed.WriteMessage(

        "\nLatitude-longitude is {0},{1}", lonlat.Y, lonlat.X

      );

    }

 

    [CommandMethod("PFLL")]

    public void PointFromLatLong()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

 

      if (!HasGeoData(db))

      {

        ed.WriteMessage(

          "\nCurrent drawing has no geo-location information."

        );

        return;

      }

 

      // Get the latitude and longitude to be translated

      // to a drawing point

 

      var pdo = new PromptDoubleOptions("\nEnter latitude");

      var pdr = ed.GetDouble(pdo);

      if (pdr.Status != PromptStatus.OK)

        return;

 

      var lat = pdr.Value;

 

      pdo.Message = "\nEnter longitude";

      pdr = ed.GetDouble(pdo);

      if (pdr.Status != PromptStatus.OK)

        return;

 

      var lon = pdr.Value;

 

      var lonlat = new Point3d(lon, lat, 0.0);

 

      // Translate the lat-lon to a drawing point

 

      var res = TranslateGeoPoint(db, lonlat, false);

 

      // Print any coordinate system information

 

      PrintCoordinateSystem(ed, res.Item2);

 

      // And then the point itself

 

      var dwgPt = res.Item1;

      ed.WriteMessage(

        "\nDrawing point is {0},{1},{2}", dwgPt.X, dwgPt.Y, dwgPt.Z

      );

    }

 

    private static void PrintCoordinateSystem(Editor ed, string xml)

    {

      var gcs = GeoCoordinateSystem.Create(xml);

      ed.WriteMessage("\nCoordinate system: {0}", gcs.ID);

    }

 

    private Tuple<Point3d, string> TranslateGeoPoint(

      Database db, Point3d inPt, bool fromDwg

    )

    {

      using (

        var tr = db.TransactionManager.StartOpenCloseTransaction()

      )

      {

        // Get the drawing's GeoLocation object

 

        var gd =

          tr.GetObject(db.GeoDataObject, OpenMode.ForRead)

            as GeoLocationData;

 

        // Get the output point...

        // dwg2lonlat if fromDwg is true,

        // lonlat2dwg otherwise

 

        var outPt =

          (fromDwg ?

            gd.TransformToLonLatAlt(inPt) :

            gd.TransformFromLonLatAlt(inPt)

          );

 

        var cs = gd.CoordinateSystem;

 

        tr.Commit();

 

        return new Tuple<Point3d, string>(outPt, cs);

      }

    }

 

    private static bool HasGeoData(Database db)

    {

      // Check whether the drawing already has geolocation data

 

      bool hasGeoData = false;

      try

      {

        var gdId = db.GeoDataObject;

        hasGeoData = true;

      }

      catch { }

      return hasGeoData;

    }

  }

}

When we run the LCS command, we actually see the tail end of a looong list of coordinate systems:

6446 WGS72be/b.UTM-16N

6447 WGS72be/b.UTM-16S

6448 WGS72be/b.UTM-17N

6449 WGS72be/b.UTM-17S

6450 WGS72be/b.UTM-18N

6451 WGS72be/b.UTM-18S

6452 WGS72be/b.UTM-19N

6453 WGS72be/b.UTM-19S

6454 WGS72be/b.UTM-1N

6455 WGS72be/b.UTM-1S

6456 WGS72be/b.UTM-20N

6457 WGS72be/b.UTM-20S

6458 WGS72be/b.UTM-21N

6459 WGS72be/b.UTM-21S

6460 WGS72be/b.UTM-22N

6461 WGS72be/b.UTM-22S

6462 WGS72be/b.UTM-23N

6463 WGS72be/b.UTM-23S

6464 WGS72be/b.UTM-24N

6465 WGS72be/b.UTM-24S

6466 WGS72be/b.UTM-25N

6467 WGS72be/b.UTM-25S

6468 WGS72be/b.UTM-26N

6469 WGS72be/b.UTM-26S

6470 WGS72be/b.UTM-27N

6471 WGS72be/b.UTM-27S

6472 WGS72be/b.UTM-28N

6473 WGS72be/b.UTM-28S

6474 WGS72be/b.UTM-29N

6475 WGS72be/b.UTM-29S

6476 WGS72be/b.UTM-2N

6477 WGS72be/b.UTM-2S

6478 WGS72be/b.UTM-30N

6479 WGS72be/b.UTM-30S

6480 WGS72be/b.UTM-31N

6481 WGS72be/b.UTM-31S

6482 WGS72be/b.UTM-32N

6483 WGS72be/b.UTM-32S

6484 WGS72be/b.UTM-33N

6485 WGS72be/b.UTM-33S

6486 WGS72be/b.UTM-34N

6487 WGS72be/b.UTM-34S

6488 WGS72be/b.UTM-35N

6489 WGS72be/b.UTM-35S

6490 WGS72be/b.UTM-36N

6491 WGS72be/b.UTM-36S

6492 WGS72be/b.UTM-37N

6493 WGS72be/b.UTM-37S

6494 WGS72be/b.UTM-38N

6495 WGS72be/b.UTM-38S

6496 WGS72be/b.UTM-39N

6497 WGS72be/b.UTM-39S

6498 WGS72be/b.UTM-3N

6499 WGS72be/b.UTM-3S

6500 WGS72be/b.UTM-40N

6501 WGS72be/b.UTM-40S

6502 WGS72be/b.UTM-41N

6503 WGS72be/b.UTM-41S

6504 WGS72be/b.UTM-42N

6505 WGS72be/b.UTM-42S

6506 WGS72be/b.UTM-43N

6507 WGS72be/b.UTM-43S

6508 WGS72be/b.UTM-44N

6509 WGS72be/b.UTM-44S

6510 WGS72be/b.UTM-45N

6511 WGS72be/b.UTM-45S

6512 WGS72be/b.UTM-46N

6513 WGS72be/b.UTM-46S

6514 WGS72be/b.UTM-47N

6515 WGS72be/b.UTM-47S

6516 WGS72be/b.UTM-48N

6517 WGS72be/b.UTM-48S

6518 WGS72be/b.UTM-49N

6519 WGS72be/b.UTM-49S

6520 WGS72be/b.UTM-4N

6521 WGS72be/b.UTM-4S

6522 WGS72be/b.UTM-50N

6523 WGS72be/b.UTM-50S

6524 WGS72be/b.UTM-51N

6525 WGS72be/b.UTM-51S

6526 WGS72be/b.UTM-52N

6527 WGS72be/b.UTM-52S

6528 WGS72be/b.UTM-53N

6529 WGS72be/b.UTM-53S

6530 WGS72be/b.UTM-54N

6531 WGS72be/b.UTM-54S

6532 WGS72be/b.UTM-55N

6533 WGS72be/b.UTM-55S

6534 WGS72be/b.UTM-56N

6535 WGS72be/b.UTM-56S

6536 WGS72be/b.UTM-57N

6537 WGS72be/b.UTM-57S

6538 WGS72be/b.UTM-58N

6539 WGS72be/b.UTM-58S

6540 WGS72be/b.UTM-59N

6541 WGS72be/b.UTM-59S

6542 WGS72be/b.UTM-5N

6543 WGS72be/b.UTM-5S

6544 WGS72be/b.UTM-60N

6545 WGS72be/b.UTM-60S

6546 WGS72be/b.UTM-6N

6547 WGS72be/b.UTM-6S

6548 WGS72be/b.UTM-7N

6549 WGS72be/b.UTM-7S

6550 WGS72be/b.UTM-8N

6551 WGS72be/b.UTM-8S

6552 WGS72be/b.UTM-9N

6553 WGS72be/b.UTM-9S

6554 WGS84.AusAntarctic/LM

6555 WGS84.BLM-14NF

6556 WGS84.BLM-15NF

6557 WGS84.BLM-16NF

6558 WGS84.BLM-17NF

6559 WGS84.CanLam.3

6560 WGS84.CRTM05

6561 WGS84.PlateCarree

6562 WGS84.PseudoMercator

6563 WGS84.SCAR-SP19-20

6564 WGS84.SCAR-SP21-22

6565 WGS84.SCAR-SP23-24

6566 WGS84.SCAR-SQ01-02

6567 WGS84.SCAR-SQ19-20

6568 WGS84.SCAR-SQ21-22

6569 WGS84.SCAR-SQ37-38

6570 WGS84.SCAR-SQ39-40

6571 WGS84.SCAR-SQ41-42

6572 WGS84.SCAR-SQ43-44

6573 WGS84.SCAR-SQ45-46

6574 WGS84.SCAR-SQ47-48

6575 WGS84.SCAR-SQ49-50

6576 WGS84.SCAR-SQ51-52

6577 WGS84.SCAR-SQ53-54

6578 WGS84.SCAR-SQ55-56

6579 WGS84.SCAR-SQ57-58

6580 WGS84.SCAR-SR13-14

6581 WGS84.SCAR-SR15-16

6582 WGS84.SCAR-SR17-18

6583 WGS84.SCAR-SR19-20

6584 WGS84.SCAR-SR27-28

6585 WGS84.SCAR-SR29-30

6586 WGS84.SCAR-SR31-32

6587 WGS84.SCAR-SR33-34

6588 WGS84.SCAR-SR35-36

6589 WGS84.SCAR-SR37-38

6590 WGS84.SCAR-SR39-40

6591 WGS84.SCAR-SR41-42

6592 WGS84.SCAR-SR43-44

6593 WGS84.SCAR-SR45-46

6594 WGS84.SCAR-SR47-48

6595 WGS84.SCAR-SR49-50

6596 WGS84.SCAR-SR51-52

6597 WGS84.SCAR-SR53-54

6598 WGS84.SCAR-SR55-56

6599 WGS84.SCAR-SR57-58

6600 WGS84.SCAR-SR59-60

6601 WGS84.SCAR-SS04-06

6602 WGS84.SCAR-SS07-09

6603 WGS84.SCAR-SS10-12

6604 WGS84.SCAR-SS13-15

6605 WGS84.SCAR-SS16-18

6606 WGS84.SCAR-SS19-21

6607 WGS84.SCAR-SS25-27

6608 WGS84.SCAR-SS28-30

6609 WGS84.SCAR-SS31-33

6610 WGS84.SCAR-SS34-36

6611 WGS84.SCAR-SS37-39

6612 WGS84.SCAR-SS40-42

6613 WGS84.SCAR-SS43-45

6614 WGS84.SCAR-SS46-48

6615 WGS84.SCAR-SS49-51

6616 WGS84.SCAR-SS52-54

6617 WGS84.SCAR-SS55-57

6618 WGS84.SCAR-SS58-60

6619 WGS84.SCAR-ST01-04

6620 WGS84.SCAR-ST05-08

6621 WGS84.SCAR-ST09-12

6622 WGS84.SCAR-ST13-16

6623 WGS84.SCAR-ST17-20

6624 WGS84.SCAR-ST21-24

6625 WGS84.SCAR-ST25-28

6626 WGS84.SCAR-ST29-32

6627 WGS84.SCAR-ST33-36

6628 WGS84.SCAR-ST37-40

6629 WGS84.SCAR-ST41-44

6630 WGS84.SCAR-ST45-48

6631 WGS84.SCAR-ST49-52

6632 WGS84.SCAR-ST53-56

6633 WGS84.SCAR-ST57-60

6634 WGS84.TM-116SE

6635 WGS84.TM-132SE

6636 WGS84.TM-36SE

6637 WGS84.TM-6NE

6638 WGS84.UPSNorth

6639 WGS84.UPSSouth

6640 WGS84.USGS-AntarticMtn

6641 WGS84.Winkel

6642 WI-C

6643 WI-N

6644 WI-S

6645 WI27-TM

6646 WI83-C

6647 WI83-CF

6648 WI83-N

6649 WI83-NF

6650 WI83-S

6651 WI83-SF

6652 WI83-TM

6653 WIHP-C

6654 WIHP-CF

6655 WIHP-N

6656 WIHP-NF

6657 WIHP-S

6658 WIHP-SF

6659 WilkinMN-F

6660 WilkinMN-IF

6661 WilkinMN-M

6662 WinnebagoWI-F

6663 WinnebagoWI-IF

6664 WinnebagoWI-M

6665 WinonaMN-F

6666 WinonaMN-IF

6667 WinonaMN-M

6668 WisconsinTM-HP

6669 WoodWI-F

6670 WoodWI-IF

6671 WoodWI-M

6672 WORLD-EQDIST-CYL

6673 WORLD-LL

6674 WORLD-LM-CONIC

6675 WORLD-LM-TAN

6676 WORLD-MERCATOR

6677 WORLD-MILLER

6678 WORLD-ROBINSON

6679 WORLD-SINUSOIDAL

6680 WORLD-VDGRNTN

6681 WrightMN-F

6682 WrightMN-IF

6683 WrightMN-M

6684 WSIG

6685 WV-N

6686 WV-S

6687 WV83-N

6688 WV83-NF

6689 WV83-S

6690 WV83-SF

6691 WVHP-N

6692 WVHP-NF

6693 WVHP-S

6694 WVHP-SF

6695 WY-E

6696 WY-EC

6697 WY-W

6698 WY-WC

6699 WY83-E

6700 WY83-EC

6701 WY83-ECF

6702 WY83-EF

6703 WY83-W

6704 WY83-WC

6705 WY83-WCF

6706 WY83-WF

6707 WYHP-E

6708 WYHP-EC

6709 WYHP-ECF

6710 WYHP-EF

6711 WYHP-W

6712 WYHP-WC

6713 WYHP-WCF

6714 WYHP-WF

6715 Xian80.GK-13

6716 Xian80.GK-14

6717 Xian80.GK-15

6718 Xian80.GK-16

6719 Xian80.GK-17

6720 Xian80.GK-18

6721 Xian80.GK-19

6722 Xian80.GK-20

6723 Xian80.GK-21

6724 Xian80.GK-22

6725 Xian80.GK-23

6726 Xian80.GK/CM-105E

6727 Xian80.GK/CM-111E

6728 Xian80.GK/CM-117E

6729 Xian80.GK/CM-123E

6730 Xian80.GK/CM-129E

6731 Xian80.GK/CM-135E

6732 Xian80.GK/CM-75E

6733 Xian80.GK/CM-81E

6734 Xian80.GK/CM-87E

6735 Xian80.GK/CM-93E

6736 Xian80.GK/CM-99E

6737 Xian80.GK3d-25

6738 Xian80.GK3d-26

6739 Xian80.GK3d-27

6740 Xian80.GK3d-28

6741 Xian80.GK3d-29

6742 Xian80.GK3d-30

6743 Xian80.GK3d-31

6744 Xian80.GK3d-32

6745 Xian80.GK3d-33

6746 Xian80.GK3d-34

6747 Xian80.GK3d-35

6748 Xian80.GK3d-36

6749 Xian80.GK3d-37

6750 Xian80.GK3d-38

6751 Xian80.GK3d-39

6752 Xian80.GK3d-40

6753 Xian80.GK3d-41

6754 Xian80.GK3d-42

6755 Xian80.GK3d-43

6756 Xian80.GK3d-44

6757 Xian80.GK3d-45

6758 Xian80.GK3d/CM-102E

6759 Xian80.GK3d/CM-105E

6760 Xian80.GK3d/CM-108E

6761 Xian80.GK3d/CM-111E

6762 Xian80.GK3d/CM-114E

6763 Xian80.GK3d/CM-117E

6764 Xian80.GK3d/CM-120E

6765 Xian80.GK3d/CM-123E

6766 Xian80.GK3d/CM-126E

6767 Xian80.GK3d/CM-129E

6768 Xian80.GK3d/CM-132E

6769 Xian80.GK3d/CM-135E

6770 Xian80.GK3d/CM-75E

6771 Xian80.GK3d/CM-78E

6772 Xian80.GK3d/CM-81E

6773 Xian80.GK3d/CM-84E

6774 Xian80.GK3d/CM-87E

6775 Xian80.GK3d/CM-90E

6776 Xian80.GK3d/CM-93E

6777 Xian80.GK3d/CM-96E

6778 Xian80.GK3d/CM-99E

6779 Xian80.LL

6780 XY-BC

6781 XY-BL

6782 XY-BR

6783 XY-CA

6784 XY-CC

6785 XY-CF

6786 XY-CG

6787 XY-CL

6788 XY-CM

6789 XY-DAM

6790 XY-DK

6791 XY-DM

6792 XY-FT

6793 XY-FU

6794 XY-GC

6795 XY-GL

6796 XY-GM

6797 XY-HM

6798 XY-IFT

6799 XY-IIN

6800 XY-IMI

6801 XY-IN

6802 XY-IYD

6803 XY-KM

6804 XY-KT

6805 XY-M

6806 XY-MI

6807 XY-ML

6808 XY-MM

6809 XY-NM

6810 XY-PE

6811 XY-PO

6812 XY-RD

6813 XY-RO

6814 XY-SC

6815 XY-SL

6816 XY-SY

6817 XY-UI

6818 XY-YD

6819 Yacare.LL

6820 Yacare/E.LL

6821 YAP

6822 YellowMedicineMN-F

6823 YellowMedicineMN-IF

6824 YellowMedicineMN-M

6825 YEM-E

6826 YEM-GK7

6827 YEM-GK8

6828 YEM-GK9

6829 YEM-U37

6830 YEM-U38

6831 YEM-U39

6832 YEM-W

6833 Yemen96.UTM-38N

6834 Yemen96.UTM-39N

6835 YemenNtl96.LL

6836 ZAIRE

6837 Zanderij.LL

6838 Zanderij.SurinameTM

6839 Zanderij.SurinameTMOld

6840 Zanderij.TM-54NW

6841 Zanderij.UTM-21N

6842 ZIM-35

6843 ZIM-35/01

6844 ZIM-36

6845 ZIM-36/01

So we can see we have 6845 coordinate systems available to us (many more than appear in the list inside AutoCAD 2015). I’m not sure how they’d best be filtered or categorised – this isn’t my specialist field, of course – but if someone has a request or suggestion for how to do so, please post a comment. Clearly a lot more can be done with this list than just printing the ID of each entry.

Update:

Mads tells me the library we use contains more coordinate systems than AutoCAD supports: AutoCAD only currently works with “projected” coordinate systems (the others in the list are used by other Autodesk products). You can modify the LCS command to filter the list by wrapping the contents of its foreach loop in an if statement, e.g.:

if (cs.Type == GeoCSType.Projected)

{

  ed.WriteMessage("\n{0} {1}", i, cs.ID);

  ++i;

}

That reduces the list down to 6132 coordinate systems that can be used by AutoCAD. Which should still be plenty, I would think. Thanks, Mads!

September 04, 2014

Translating between AutoCAD drawing points and geographic locations using .NET – Part 2

In yesterday’s post we saw a simple implementation of two commands to translate between geographical locations (latitude-longitude values) and drawing points inside AutoCAD.

In this post we’re extending that to access the current coordinate system, as returned by the GeoLocation object attached to the current drawing. Which in some ways should be simple, but then the CoordinateSystem property actually returns XML data, not just the simple coordinate system name you probably passed in when choosing it (see the IGR command, below, to see what I mean):

<?xml version="1.0" encoding="utf-16" standalone="no" ?>

<Dictionary

  version="1.0"

  xmlns="http://www.osgeo.org/mapguide/coordinatesystem">

  <ProjectedCoordinateSystem id="SWISS">

    <Name>SWISS</Name>

    <Description>

      Deprecated as duplicate of CH1903.LV03/01

    </Description>

    <Authority>Bundesamt fur Landestopographie</Authority>

    <AdditionalInformation>

      <ParameterItem type="CsMap">

        <Key>CSQuadrantSimplified</Key>

        <IntegerValue>1</IntegerValue>

      </ParameterItem>

    </AdditionalInformation>

    <DatumId>CH-1903</DatumId>

    <Axis uom="METER">

      <CoordinateSystemAxis>

        <AxisOrder>1</AxisOrder>

        <AxisName>Easting</AxisName>

        <AxisAbbreviation>E</AxisAbbreviation>

        <AxisDirection>east</AxisDirection>

      </CoordinateSystemAxis>

      <CoordinateSystemAxis>

        <AxisOrder>2</AxisOrder>

        <AxisName>Northing</AxisName>

        <AxisAbbreviation>N</AxisAbbreviation>

        <AxisDirection>north</AxisDirection>

      </CoordinateSystemAxis>

    </Axis>

    <Conversion>

      <Projection>

        <OperationMethodId>

          Swiss Oblique Cylindrical

        </OperationMethodId>

        <ParameterValue>

          <OperationParameterId>

            Longitude of false origin

          </OperationParameterId>

          <Value uom="degree">7.43958333333333</Value>

        </ParameterValue>

        <ParameterValue>

          <OperationParameterId>

            Latitude of false origin

          </OperationParameterId>

          <Value uom="degree">46.9524055555556</Value>

        </ParameterValue>

        <ParameterValue>

          <OperationParameterId>False easting</OperationParameterId>

          <Value uom="METER">600000</Value>

        </ParameterValue>

        <ParameterValue>

          <OperationParameterId>False northing</OperationParameterId>

          <Value uom="METER">200000</Value>

        </ParameterValue>

      </Projection>

    </Conversion>

  </ProjectedCoordinateSystem>

  <GeodeticDatum id="CH-1903">

    <Name>CH-1903</Name>

    <Description>

      Swiss National Geodetic System Aug 1990, Switzerland (7 Param)

    </Description>

    <Authority>Bundesamt fur Landestopographie</Authority>

    <PrimeMeridianId>Greenwich</PrimeMeridianId>

    <EllipsoidId>BESSEL</EllipsoidId>

  </GeodeticDatum>

  <Alias id="6801" type="Datum">

    <ObjectId>CH-1903</ObjectId>

    <Namespace>EPSG Code</Namespace>

  </Alias>

  <Ellipsoid id="BESSEL">

    <Name>BESSEL</Name>

    <Description>Bessel, 1841</Description>

    <Authority>

      US Defense Mapping Agency, TR-8350.2-B, December 1987

    </Authority>

    <SemiMajorAxis uom="meter">6377397.155</SemiMajorAxis>

    <SecondDefiningParameter>

      <SemiMinorAxis uom="meter">6356078.963</SemiMinorAxis>

    </SecondDefiningParameter>

  </Ellipsoid>

  <Alias id="7004" type="Ellipsoid">

    <ObjectId>BESSEL</ObjectId>

    <Namespace>EPSG Code</Namespace>

  </Alias>

  <Transformation id="CH-1903_to_WGS84">

    <Name>CH-1903_to_WGS84</Name>

    <Description>

      Swiss National Geodetic System Aug 1990, Switzerland (7 Param)

    </Description>

    <Authority>Bundesamt fur Landestopographie</Authority>

    <CoordinateOperationAccuracy>

      <Accuracy uom="meter">3</Accuracy>

    </CoordinateOperationAccuracy>

    <SourceDatumId>CH-1903</SourceDatumId>

    <TargetDatumId>WGS84</TargetDatumId>

    <IsReversible>true</IsReversible>

    <OperationMethod>

      <OperationMethodId>

        Seven Parameter Transformation

      </OperationMethodId>

      <ParameterValue>

        <OperationParameterId>

          X-axis translation

        </OperationParameterId>

        <Value uom="meter">660.077</Value>

      </ParameterValue>

      <ParameterValue>

        <OperationParameterId>

          Y-axis translation

        </OperationParameterId>

        <Value uom="meter">13.551</Value>

      </ParameterValue>

      <ParameterValue>

        <OperationParameterId>

          Z-axis translation

        </OperationParameterId>

        <Value uom="meter">369.344</Value>

      </ParameterValue>

      <ParameterValue>

        <OperationParameterId>

          X-axis rotation

        </OperationParameterId>

        <Value uom="degree">0.00022356</Value>

      </ParameterValue>

      <ParameterValue>

        <OperationParameterId>

          Y-axis rotation

        </OperationParameterId>

        <Value uom="degree">0.00016047</Value>

      </ParameterValue>

      <ParameterValue>

        <OperationParameterId>

          Z-axis rotation

        </OperationParameterId>

        <Value uom="degree">0.00026451</Value>

      </ParameterValue>

      <ParameterValue>

        <OperationParameterId>

          Scale difference

        </OperationParameterId>

        <Value uom="unity">5.66e-006</Value>

      </ParameterValue>

    </OperationMethod>

  </Transformation>

</Dictionary>

While I’m sure this contains some great information, in our case we’re just after the name of the chosen coordinate system (i.e. “SWISS”).

To access this, we’ve integrated some code from here and here, so that we can use a dynamic object to navigate down through the XML DOM without having to worry about namespace prefixes (which would otherwise be needed for the above XML data, for instance).

That allows us to access the contents of the coordinate system XML much more easily. Here’s how we access the “id” attribute of the ProjectedCoordinateSystem element, for instance:

csx.ProjectedCoordinateSystem.id;

The same approach might be used to access whatever else you’re interested in from the XML coordinate system information, of course.
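For instance – and this is just a minimal sketch, assuming xml holds the string returned by the GeoLocation object’s CoordinateSystem property and ed is the active document’s Editor (the DynamicXml helper is shown in the full listing below) – we might pull out the datum and ellipsoid names alongside the coordinate system ID:

dynamic csx = DynamicXml.Parse(xml);

// Property names follow the element and attribute names
// in the sample XML shown above
string csName = csx.ProjectedCoordinateSystem.id;
string datum = csx.GeodeticDatum.Name;
string ellipsoid = csx.Ellipsoid.Name;

ed.WriteMessage(
  "\nCoordinate system {0} uses the {1} datum and the {2} ellipsoid.",
  csName, datum, ellipsoid
);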

Here’s a Screencast of the updated commands in action:




Here’s the C# code that includes the XML parsing code as well as the updated LLFP and PFLL commands:

using Autodesk.AutoCAD.ApplicationServices;

using Autodesk.AutoCAD.DatabaseServices;

using Autodesk.AutoCAD.EditorInput;

using Autodesk.AutoCAD.Geometry;

using Autodesk.AutoCAD.Runtime;

using Microsoft.CSharp.RuntimeBinder;

using System;

using System.Dynamic;

using System.Linq;

using System.Text.RegularExpressions;

using System.Xml.Linq;

 

namespace GeoLocationAPI

{

  public class Commands

  {

    [CommandMethod("IGR")]

    public void InsertGeoRef()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

      var msId = SymbolUtilityServices.GetBlockModelSpaceId(db);

 

      if (HasGeoData(db))

      {

        // Report and return: could also open the object for

        // write and modify its properties, of course

 

        ed.WriteMessage("\nDrawing already has geo-location data!");

        return;

      }

 

      // Let's create some geolocation data for this drawing,

      // using a handy method to add it to the modelspace

      // (it gets added to the extension dictionary)

 

      var data = new GeoLocationData();

      data.BlockTableRecordId = msId;

      data.PostToDb();

 

      // We're going to define our geolocation in terms of

      // latitude/longitude using the Mercator projection

      // http://en.wikipedia.org/wiki/Mercator_projection

 

      data.CoordinateSystem = "WORLD-MERCATOR";

      data.TypeOfCoordinates = TypeOfCoordinates.CoordinateTypeGrid;

 

      // Use the lat-long for La Tene, my local "beach"

      // (it's on a lake, after all :-)     

 

      var geoPt = new Point3d(7.019438, 47.005247, 0);

 

      // Transform from a geographic to a modelspace point

      // and add the information to our geolocation data

 

      var wcsPt = data.TransformFromLonLatAlt(geoPt);

      data.DesignPoint = wcsPt;

      data.ReferencePoint = geoPt;

 

      // Let's launch the GEOMAP command to show our geographic

      // overlay

 

      ed.Command("_.GEOMAP", "_AERIAL");

 

      // Now we'll add a circle around our location

      // and that will provide the extents for our zoom

 

      using (var tr = db.TransactionManager.StartTransaction())

      {

        var ms =

          tr.GetObject(msId, OpenMode.ForWrite) as BlockTableRecord;

        if (ms != null)

        {

          // Add a red circle of 7K units radius

          // centred on our point

 

          var circle = new Circle(wcsPt, Vector3d.ZAxis, 7000);

          circle.ColorIndex = 1;

          ms.AppendEntity(circle);

          tr.AddNewlyCreatedDBObject(circle, true);

        }

        tr.Commit();

      }

 

      // And we'll zoom to the circle's extents

 

      ed.Command("_.ZOOM", "_OBJECT", "_L", "");

    }

 

    [CommandMethod("CGI")]

    public void CreateGeoMapImage()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

 

      // Get the first corner of our area to convert to a

      // GeomapImage

 

      var ppo = new PromptPointOptions("\nSpecify first corner");

      var ppr = ed.GetPoint(ppo);

      if (ppr.Status != PromptStatus.OK)

        return;

 

      var first = ppr.Value;

 

      // And get the second point as a corner (to rubber-band

      // the selection)

 

      var pco =

        new PromptCornerOptions("\nSpecify second corner", first);

      ppr = ed.GetCorner(pco);

 

      if (ppr.Status != PromptStatus.OK)

        return;

 

      var second = ppr.Value;

 

      // We'll use an event handler on the Database to check for

      // GeomapImage entities being added

      // (we'll use a lambda but assigned to a variable to be

      // able to remove it, afterwards)

 

      ObjectId giId = ObjectId.Null;

      ObjectEventHandler handler =

        (s, e) =>

        {

          if (e.DBObject is GeomapImage)

          {

            giId = e.DBObject.ObjectId;

          }

        };

 

      // Simply call the GEOMAPIMAGE command with the two points

 

      db.ObjectAppended += handler;

      ed.Command("_.GEOMAPIMAGE", first, second);

      db.ObjectAppended -= handler;

 

      // Only continue if we've collected a valid ObjectId

 

      if (giId == ObjectId.Null)

        return;

 

      // Open the entity and change some values

 

      try

      {

        using (var tr = doc.TransactionManager.StartTransaction())

        {

          // Get each object and check if it's a GeomapImage

 

          var gi =

            tr.GetObject(giId, OpenMode.ForWrite) as GeomapImage;

          if (gi != null)

          {

            // Let's adjust the brightness/contrast/fade of the

            // GeomapImage

 

            gi.Brightness = 90;

            gi.Contrast = 40;

            gi.Fade = 20;

 

            // And make sure it's at the right resolution and

            // shows both aerial and road information

 

            gi.Resolution = GeomapResolution.Optimal;

            gi.MapType = GeomapType.Hybrid;

 

            gi.UpdateMapImage(true);

          }

 

          tr.Commit();

        }

      }

      catch (Autodesk.AutoCAD.Runtime.Exception)

      {

        ed.WriteMessage(

          "\nUnable to update geomap image entity." +

          "\nPlease check your internet connectivity and call " +

          "GEOMAPIMAGEUPDATE."

        );

      }

    }

 

    [CommandMethod("LLFP")]

    public void LatLongFromPoint()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

 

      if (!HasGeoData(db))

      {

        ed.WriteMessage(

          "\nCurrent drawing has no geo-location information."

        );

        return;

      }

 

      // Get the drawing point to be translated into a lat-lon

 

      var ppo = new PromptPointOptions("\nSpecify point");

      var ppr = ed.GetPoint(ppo);

      if (ppr.Status != PromptStatus.OK)

        return;

 

      var dwgPt = ppr.Value;

 

      // Translate the drawing point to a lat-lon

 

      var res = TranslateGeoPoint(db, dwgPt, true);

 

      // Print any coordinate system information

 

      PrintCoordinateSystem(ed, res.Item2);

 

      // And then the point itself

 

      var lonlat = res.Item1;

      ed.WriteMessage(

        "\nLatitude-longitude is {0},{1}", lonlat.Y, lonlat.X

      );

    }

 

    [CommandMethod("PFLL")]

    public void PointFromLatLong()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

 

      if (!HasGeoData(db))

      {

        ed.WriteMessage(

          "\nCurrent drawing has no geo-location information."

        );

        return;

      }

 

      // Get the latitude and longitude to be translated

      // to a drawing point

 

      var pdo = new PromptDoubleOptions("\nEnter latitude");

      var pdr = ed.GetDouble(pdo);

      if (pdr.Status != PromptStatus.OK)

        return;

 

      var lat = pdr.Value;

 

      pdo.Message = "\nEnter longitude";

      pdr = ed.GetDouble(pdo);

      if (pdr.Status != PromptStatus.OK)

        return;

 

      var lon = pdr.Value;

 

      var lonlat = new Point3d(lon, lat, 0.0);

 

      // Translate the lat-lon to a drawing point

 

      var res = TranslateGeoPoint(db, lonlat, false);

 

      // Print any coordinate system information

 

      PrintCoordinateSystem(ed, res.Item2);

 

      // And then the point itself

 

      var dwgPt = res.Item1;

      ed.WriteMessage(

        "\nDrawing point is {0},{1},{2}", dwgPt.X, dwgPt.Y, dwgPt.Z

      );

    }

 

    private static void PrintCoordinateSystem(Editor ed, string xml)

    {

      try

      {

        dynamic csx = DynamicXml.Parse(xml);

        var cs = csx.ProjectedCoordinateSystem.id;

        ed.WriteMessage("\nCoordinate system: {0}", cs);

      }

      catch (RuntimeBinderException)

      {

        ed.WriteMessage("\nNo coordinate system information.");

      }

    }

 

    private Tuple<Point3d, string> TranslateGeoPoint(

      Database db, Point3d inPt, bool fromDwg

    )

    {

      using (

        var tr = db.TransactionManager.StartOpenCloseTransaction()

      )

      {

        // Get the drawing's GeoLocation object

 

        var gd =

          tr.GetObject(db.GeoDataObject, OpenMode.ForRead)

            as GeoLocationData;

 

        // Get the output point...

        // dwg2lonlat if fromDwg is true,

        // lonlat2dwg otherwise

 

        var outPt =

          (fromDwg ?

            gd.TransformToLonLatAlt(inPt) :

            gd.TransformFromLonLatAlt(inPt)

          );

 

        var cs = gd.CoordinateSystem;

 

        tr.Commit();

 

        return new Tuple<Point3d, string>(outPt, cs);

      }

    }

 

    private static bool HasGeoData(Database db)

    {

      // Check whether the drawing already has geolocation data

 

      bool hasGeoData = false;

      try

      {

        var gdId = db.GeoDataObject;

        hasGeoData = true;

      }

      catch { }

      return hasGeoData;

    }

  }

 

  // From

  // http://stackoverflow.com/questions/13704752/

  // deserialize-xml-to-object-using-dynamic

  // and

  // http://social.msdn.microsoft.com/Forums/en-US/

  // bed57335-827a-4731-b6da-a7636ac29f21/xdocument-remove-namespace

 

  public class DynamicXml : DynamicObject

  {

    XElement _root;

    private DynamicXml(XElement root)

    {

      _root = root;

    }

 

    public static DynamicXml Parse(

      string xmlString, bool stripNamespaces = true

    )

    {

      var doc = XDocument.Parse(xmlString);

      if (stripNamespaces)

      {

        doc = StripDocumentNamespaces(doc);

      }

      return new DynamicXml(doc.Root);

    }

 

    public static DynamicXml Load(

      string filename, bool stripNamespaces = true

    )

    {

      var doc = XDocument.Load(filename);

      if (stripNamespaces)

      {

        doc = StripDocumentNamespaces(doc);

      }

      return new DynamicXml(doc.Root);

    }

 

    private static XDocument StripDocumentNamespaces(

      XDocument doc

    )

    {

      // Remove all xmlns:* instances from the passed XDocument

 

      return

        XDocument.Parse(

          Regex.Replace(

            doc.ToString(), @"(xmlns:?[^=]*=[""][^""]*[""])", "",

            RegexOptions.IgnoreCase | RegexOptions.Multiline

          )

        );

    }

 

    public override bool TryGetMember(

      GetMemberBinder binder, out object result

    )

    {

      result = null;

 

      // Do we have an attribute with this name?

 

      var att = _root.Attribute(binder.Name);

      if (att != null)

      {

        result = att.Value;

        return true;

      }

 

      // Do we have a list of elements with this name?

 

      var nodes = _root.Elements(binder.Name);

      if (nodes.Count() > 1)

      {

        result = nodes.Select(n => new DynamicXml(n)).ToList();

        return true;

      }

 

      // Do we have a single element with this name?

 

      var node = _root.Element(binder.Name);

      if (node != null)

      {

        if (node.HasElements)

        {

          result = new DynamicXml(node);

        }

        else

        {

          result = node.Value;

        }

        return true;

      }

 

      return true;

    }

  }

}

If you’re attending AU 2014 and are interested in learning more about this API, I recommend signing up for a class being delivered by my friend and colleague, Mads Paulin: Building Location-Aware Applications in AutoCAD 2015. Mads is the software architect on the AutoCAD team who is responsible for this feature, so it’s a great opportunity to get information straight from the source. (Unfortunately this session conflicts with a class at which I’m co-speaking, otherwise I’d be there, too.)

Translating between AutoCAD drawing points and geographic locations using .NET – Part 1

I recently received a question from Coralie Jacobi in response to this post:

Saw your post a while ago on the geolocation in 2015. This functionality is something that we will use a great deal and I will definitely be writing some code to utilize it. What I’d like to see in your blog is something that would show me how to get to the lat\long values of the location I have picked and convert them to the coordinate system that I have selected instead of having the user have to stipulate the location in AutoCAD.

In this post that’s just what we’re going to take a look at: using the GeoLocation object attached to a drawing to translate between drawing points and lat-long values (and vice-versa). The capability is there – assuming a drawing has been geo-located – and is very straightforward to use. We’ll create a simple utility function to check the existence of a geo-location object and another to do the actual translation. The translation itself is extremely easy to perform: it’s a simple matter of choosing which of two methods to call (i.e. in which direction to do the translation).

The translation is based on the currently used coordinate system, which you can also access via the GeoLocation object. In fact, in tomorrow’s post we’re going to see just that: how we can retrieve information about the current coordinate system (in our case we’re going to focus on the name, but the code will be easy to adapt to get other information).

Let’s start with a quick Screencast of the code in action:




Here’s the C# code that implements the new helper functions and two commands that use them: LLFP and PFLL (for LatLongFromPoint and PointFromLatLong, respectively). We might also have exposed the underlying translation helper to LISP, which would make it easier to call from the command-line, but that’s been left as an exercise for the reader. The implementation includes the previous commands to add a geographic location to the current drawing as well as to create an embedded image of the current geography.
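Before the main listing, here’s a minimal sketch of how that LISP exposure might look – an assumption on my part rather than something in the sample – relying on the TranslateGeoPoint() and HasGeoData() helpers defined further down (the dwg2lonlat name is just an example):

[LispFunction("dwg2lonlat")]
public ResultBuffer DwgToLonLat(ResultBuffer args)
{
  // Usage from LISP: (dwg2lonlat '(100.0 200.0 0.0))
  var doc = Application.DocumentManager.MdiActiveDocument;
  if (doc == null || args == null)
    return null;
  var db = doc.Database;
  if (!HasGeoData(db))
    return null;

  // Expect a single 3D point argument
  var tvs = args.AsArray();
  if (tvs.Length != 1 || tvs[0].TypeCode != (int)LispDataType.Point3d)
    return null;

  // Translate the drawing point and hand back a (lon lat alt) point
  var lonlat = TranslateGeoPoint(db, (Point3d)tvs[0].Value, true);
  return new ResultBuffer(
    new TypedValue((int)LispDataType.Point3d, lonlat)
  );
}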

using Autodesk.AutoCAD.ApplicationServices;

using Autodesk.AutoCAD.DatabaseServices;

using Autodesk.AutoCAD.EditorInput;

using Autodesk.AutoCAD.Geometry;

using Autodesk.AutoCAD.Runtime;

 

namespace GeoLocationAPI

{

  public class Commands

  {

    [CommandMethod("IGR")]

    public void InsertGeoRef()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

      var msId = SymbolUtilityServices.GetBlockModelSpaceId(db);

 

      if (HasGeoData(db))

      {

        // Report and return: could also open the object for

        // write and modify its properties, of course

 

        ed.WriteMessage("\nDrawing already has geo-location data!");

        return;

      }

 

      // Let's create some geolocation data for this drawing,

      // using a handy method to add it to the modelspace

      // (it gets added to the extension dictionary)

 

      var data = new GeoLocationData();

      data.BlockTableRecordId = msId;

      data.PostToDb();

 

      // We're going to define our geolocation in terms of

      // latitude/longitude using the Mercator projection

      // http://en.wikipedia.org/wiki/Mercator_projection

 

      data.CoordinateSystem = "WORLD-MERCATOR";

      data.TypeOfCoordinates = TypeOfCoordinates.CoordinateTypeGrid;

 

      // Use the lat-long for La Tene, my local "beach"

      // (it's on a lake, after all :-)     

 

      var geoPt = new Point3d(7.019438, 47.005247, 0);

 

      // Transform from a geographic to a modelspace point

      // and add the information to our geolocation data

 

      var wcsPt = data.TransformFromLonLatAlt(geoPt);

      data.DesignPoint = wcsPt;

      data.ReferencePoint = geoPt;

 

      // Let's launch the GEOMAP command to show our geographic

      // overlay

 

      ed.Command("_.GEOMAP", "_AERIAL");

 

      // Now we'll add a circle around our location

      // and that will provide the extents for our zoom

 

      using (var tr = db.TransactionManager.StartTransaction())

      {

        var ms =

          tr.GetObject(msId, OpenMode.ForWrite) as BlockTableRecord;

        if (ms != null)

        {

          // Add a red circle of 7K units radius

          // centred on our point

 

          var circle = new Circle(wcsPt, Vector3d.ZAxis, 7000);

          circle.ColorIndex = 1;

          ms.AppendEntity(circle);

          tr.AddNewlyCreatedDBObject(circle, true);

        }

        tr.Commit();

      }

 

      // And we'll zoom to the circle's extents

 

      ed.Command("_.ZOOM", "_OBJECT", "_L", "");

    }

 

    [CommandMethod("CGI")]

    public void CreateGeoMapImage()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

 

      // Get the first corner of our area to convert to a

      // GeomapImage

 

      var ppo = new PromptPointOptions("\nSpecify first corner");

      var ppr = ed.GetPoint(ppo);

      if (ppr.Status != PromptStatus.OK)

        return;

 

      var first = ppr.Value;

 

      // And get the second point as a corner (to rubber-band

      // the selection)

 

      var pco =

        new PromptCornerOptions("\nSpecify second corner", first);

      ppr = ed.GetCorner(pco);

 

      if (ppr.Status != PromptStatus.OK)

        return;

 

      var second = ppr.Value;

 

      // We'll use an event handler on the Database to check for

      // GeomapImage entities being added

      // (we'll use a lambda but assigned to a variable to be

      // able to remove it, afterwards)

 

      ObjectId giId = ObjectId.Null;

      ObjectEventHandler handler =

        (s, e) =>

        {

          if (e.DBObject is GeomapImage)

          {

            giId = e.DBObject.ObjectId;

          }

        };

 

      // Simply call the GEOMAPIMAGE command with the two points

 

      db.ObjectAppended += handler;

      ed.Command("_.GEOMAPIMAGE", first, second);

      db.ObjectAppended -= handler;

 

      // Only continue if we've collected a valid ObjectId

 

      if (giId == ObjectId.Null)

        return;

 

      // Open the entity and change some values

 

      try

      {

        using (var tr = doc.TransactionManager.StartTransaction())

        {

          // Get each object and check if it's a GeomapImage

 

          var gi =

            tr.GetObject(giId, OpenMode.ForWrite) as GeomapImage;

          if (gi != null)

          {

            // Let's adjust the brightness/contrast/fade of the

            // GeomapImage

 

            gi.Brightness = 90;

            gi.Contrast = 40;

            gi.Fade = 20;

 

            // And make sure it's at the right resolution and

            // shows both aerial and road information

 

            gi.Resolution = GeomapResolution.Optimal;

            gi.MapType = GeomapType.Hybrid;

 

            gi.UpdateMapImage(true);

          }

 

          tr.Commit();

        }

      }

      catch (Autodesk.AutoCAD.Runtime.Exception)

      {

        ed.WriteMessage(

          "\nUnable to update geomap image entity." +

          "\nPlease check your internet connectivity and call " +

          "GEOMAPIMAGEUPDATE."

        );

      }

    }

 

    [CommandMethod("LLFP")]

    public void LatLongFromPoint()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

 

      if (!HasGeoData(db))

      {

        ed.WriteMessage(

          "\nCurrent drawing has no geo-location information."

        );

        return;

      }

 

      // Get the drawing point to be translated into a lat-lon

 

      var ppo = new PromptPointOptions("\nSpecify point");

      var ppr = ed.GetPoint(ppo);

      if (ppr.Status != PromptStatus.OK)

        return;

 

      var dwgPt = ppr.Value;

 

      // Translate the drawing point to a lat-lon

 

      var lonlat = TranslateGeoPoint(db, dwgPt, true);

 

      ed.WriteMessage(

        "\nLatitude-longitude is {0},{1}", lonlat.Y, lonlat.X

      );

    }

 

    [CommandMethod("PFLL")]

    public void PointFromLatLong()

    {

      var doc = Application.DocumentManager.MdiActiveDocument;

      if (doc == null)

        return;

      var ed = doc.Editor;

      var db = doc.Database;

 

      if (!HasGeoData(db))

      {

        ed.WriteMessage(

          "\nCurrent drawing has no geo-location information."

        );

        return;

      }

 

      // Get the latitude and longitude to be translated

      // to a drawing point

 

      var pdo = new PromptDoubleOptions("\nEnter latitude");

      var pdr = ed.GetDouble(pdo);

      if (pdr.Status != PromptStatus.OK)

        return;

 

      var lat = pdr.Value;

 

      pdo.Message = "\nEnter longitude";

      pdr = ed.GetDouble(pdo);

      if (pdr.Status != PromptStatus.OK)

        return;

 

      var lon = pdr.Value;

 

      var lonlat = new Point3d(lon, lat, 0.0);

 

      // Translate the lat-lon to a drawing point

 

      var dwgPt = TranslateGeoPoint(db, lonlat, false);

 

      ed.WriteMessage(

        "\nDrawing point is {0},{1},{2}", dwgPt.X, dwgPt.Y, dwgPt.Z

      );

    }

 

    private Point3d TranslateGeoPoint(

      Database db, Point3d inPt, bool fromDwg

    )

    {

      using (

        var tr = db.TransactionManager.StartOpenCloseTransaction()

      )

      {

        // Get the drawing's GeoLocation object

 

        var gd =

          tr.GetObject(db.GeoDataObject, OpenMode.ForRead)

            as GeoLocationData;

 

        // Get the output point...

        // dwg2lonlat if fromDwg is true,

        // lonlat2dwg otherwise

 

        var outPt =

          (fromDwg ?

            gd.TransformToLonLatAlt(inPt) :

            gd.TransformFromLonLatAlt(inPt)

          );

        tr.Commit();

 

        return outPt;

      }

    }

 

    private static bool HasGeoData(Database db)

    {

      // Check whether the drawing already has geolocation data

 

      bool hasGeoData = false;

      try

      {

        var gdId = db.GeoDataObject;

        hasGeoData = true;

      }

      catch { }

      return hasGeoData;

    }

  }

}

In the next post we’ll build on this code to get more information about the chosen coordinate system.

September 02, 2014

My AU 2014 schedule

AU2014



Now that registration is open for Autodesk University 2014, people are busy signing up for classes. For those of you who are curious about the classes I’m delivering/hosting/attending at this year’s event, here they are. I’ll break things down day-by-day, in case you’re interested in finding an opportunity to meet up but can’t attend one of my sessions.

Monday (Dec 1st)

I’ll be attending the ADN DevDay, all day. Always lots of great information to absorb there, of course (jetlag permitting ;-).

Tuesday (Dec 2nd)

I’ll be hanging out at the ADN DevHack for most of the afternoon, although I will be heading across to host the ever-popular Meet the AutoCAD API Experts panel session (yes, Stephen Preston asked me to stand in for him, this year) from 3-4pm. I’m also planning to attend my teammate Albert Szilvasy’s AutoCAD Core Engine via HTTP session at 5-6pm: I’ve seen most of Albert’s presentation delivered at an internal conference, but I’m interested in hearing more about recent progress that has been made as well as the audience’s thoughts on how this service might be used in practice.

Wednesday (Dec 3rd)

This is my busiest day, class-wise. I’ll be presenting a session entitled Connecting AutoCAD to the Web with HTML5 and JavaScript from 10-11:30am, and then another session on Using SensorTag as a Low-Cost Sensor Array for AutoCAD from 3-4pm. By which point I’ll be ready for a beer or three at the AUGI reception, no doubt.

Thursday (Dec 4th)

On the last day of the conference I’ll be co-presenting a session on Configuring Morgan 3 Wheelers at the Geneva Motor Show using Autodesk VRED at 1-2pm with my colleague Jason Walters (he was on the team who developed the configurator we’ll be showing). It should be a really fun way to finish up the conference! Well, that and the closing party, of course.

As usual, I’m very much looking forward to this year’s AU. It’s always a great opportunity to meet new people, catch up with old friends and learn more about how customers and developers are using Autodesk technology to do important – and often incredible – things. If you’re interested in talking about your use of Autodesk technology – maybe you’re having challenges that future posts on this blog might be able to help with, for instance – then please let me know. I’m sure we can find some time to sit down and chat.

To close, here’s my personal tag-line for the conference:

“ExhAUsting yet invigAUrating, come to AU!”

(And that’s why you shouldn’t let technical people do marketing. ;-)

August 28, 2014

AutoCAD integration samples for Kinect for Windows 2 –Part 4

After introducing the series, taking a look at some basic samples and then looking at importing Kinect’s high-definition face tracking data into AutoCAD, it’s time for (in my opinion) the most interesting piece of functionality provided by the Kinect SDK, Kinect Fusion.

Kinect Fusion is a straightforward way to capture 3D volumes – allowing you to move the Kinect sensor around to capture objects from different angles – and the KINFUS command in these integration samples lets you bring the captured data into AutoCAD. Which basically turns Kinect into a low-cost – and reasonably effective – 3D scanner for AutoCAD.

The KINFUS command (I’ve now done away with having a monochrome KINFUS command and a colour KINFUSCOL command… KINFUS now just creates captures with colour) hosts the Kinect Fusion runtime component and provides visual feedback on the volume as you map it, finally giving you options for bringing the data into your active drawing. This is much as I’ve talked about in the past for KfW v1, although this version has a few differences.

Firstly, as mentioned recently, the KfW v2 implementation of Kinect Fusion is much more stable: while with v1 it was not really viable to effectively run Kinect Fusion within a 3D design app – the processing lag when marshaling 3D data made it close to unusable, at least in my experience – with v2 things are much better. It’s quite possible that much of this improvement stems from the use of a “camera pose” database, which makes tracking between frames much more reliable.

Using Kinect Fusion for “reality capture” is still some way off being a completely polished experience, but for someone with appropriate expectations – and who’s prepared to put up with a certain amount of frustration – it can be worthwhile.

Here’s a snapshot of a capture of a car we saw in a previous post:

Kinect Fusion on a car

With previous incarnations of this integration, the main way to bring the captured data into AutoCAD has been as a point cloud, mainly because we’re dealing with large volumes of data (captures of 2-3 million points are common, for instance). For fun, though – and because the point cloud import story isn’t as good with AutoCAD 2015, where you now have to index .XYZ files in ReCap Studio before attaching the resulting .RCP inside AutoCAD – I went ahead and implemented mesh import, too.

So the KINFUS command now gives you the option to bring in a mesh rather than a point cloud, once you’ve finished capturing the volume and have selected the voxel step (which allows you to reduce the size of the data you bring in).

A quick word of caution, though. Point clouds – even if they’re written to a text file for indexing – can basically be as big as you want: AutoCAD will manage to bring them in without any problem. Meshes, on the other hand, are a different matter. If you don’t care about colour, you can create a SubDMesh object that has up to about 2 million vertices (at least that’s what I’ve found). Beyond that, the mesh creation will almost certainly fail.

And if you want to capture a coloured mesh – which I have to admit is pretty awesome, see below for an example – you shouldn’t expect to go beyond 300K vertices: the creation of the SubDMesh should work fine, but AutoCAD will take forever to apply the per-vertex colours on the mesh (if it even manages to do so).
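To give an idea of the shape of that import, here’s a minimal sketch – not the sample’s actual code – of turning a captured triangle list into a SubDMesh, assuming the vertices and triangle indices have already been pulled from the Kinect Fusion mesh data (per-vertex colours are left out):

// Adds a SubDMesh built from captured triangles to the modelspace
private static ObjectId AddCapturedMesh(
  Database db, Point3dCollection vertices, Int32Collection triangles
)
{
  // SubDMesh face arrays interleave each face's vertex count with
  // its vertex indices: 3, i0, i1, i2 for every triangle
  var faces = new Int32Collection();
  for (int i = 0; i < triangles.Count; i += 3)
  {
    faces.Add(3);
    faces.Add(triangles[i]);
    faces.Add(triangles[i + 1]);
    faces.Add(triangles[i + 2]);
  }

  using (var tr = db.TransactionManager.StartTransaction())
  {
    var ms =
      (BlockTableRecord)tr.GetObject(
        SymbolUtilityServices.GetBlockModelSpaceId(db),
        OpenMode.ForWrite
      );

    var mesh = new SubDMesh();
    mesh.SetDatabaseDefaults();
    mesh.SetSubDMesh(vertices, faces, 0);

    var id = ms.AppendEntity(mesh);
    tr.AddNewlyCreatedDBObject(mesh, true);
    tr.Commit();

    return id;
  }
}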

A little narcissism comes with the territory when you’re a blogger, so I chose myself as the subject of my initial Kinect Fusion mesh import into AutoCAD. Well, that and the Kinect was pointing at me and I didn’t bother pointing it elsewhere.

As you can imagine, I was pretty excited when I first managed to get colours imported. Here’s a little tale of how things looked…

The mesh was first displayed in AutoCAD with the “conceptual” visual style, but it was too edgy:

Conceptual selfie

I switched the visual style to “realistic”, but it was too shiny:

Realistic selfie

And finally I created my own visual style which was “just right”:

Better selfie

Just to prove it is indeed 3D, here’s a quick GIF (although reduced to 256 colours, of course):

3D selfie

Again, this mesh has 286K vertices, so it’s fairly small by Kinect Fusion’s standards: I chose a 1 m³ capture volume with a voxel density of 256 per metre and then didn’t even move the camera before completing the capture. In case you want to check it out, here’s the 10 MB drawing file.

I do believe it’s possible to use Kinect Fusion to capture complete objects – if enough care is taken not to move the sensor too quickly, and you’re sensible about not trying to capture too much at once. Once tracking fails, I often find the results become a little unpredictable: you might find you have multiple ground planes at different angles, for instance. It’s natural to want to capture as much as possible in one go, but I’ve found that performing multiple captures that you later aggregate ends up being more efficient. Yes, this is much easier with point clouds than with meshes, of course.

I should mention that the system I’m using is a few years old and a “mobile workstation” (i.e. a high-powered notebook) rather than a full desktop. Here are the results from the Kinect v2 Configuration Verifier:

Struggling to maintain framerate

I would hope that using a full desktop – or a newer/beefier notebook – would improve tracking and therefore the quality of the results you can generate using Kinect Fusion with AutoCAD. It’s nonetheless interesting to see how well things are moving ahead, though, and this is only with a preview release of the Kinect SDK.
