Through the Interface

February 04, 2015

Building a Death Star with AutoCAD and 123D Make at the Fab Lab

So yes, I like Star Wars. And my kids like Star Wars, too. In my office downstairs at home, I have the pride and joy of my modest collection of Star Wars-related goodies, a LEGO Death Star:

My LEGO Death Star

It pains me to leave it downstairs and only let the kids play with it under supervision, but it was a hassle (albeit a very enjoyable hassle) to build and I’d hate to see it damaged. And if anyone thinks that makes me sound like Will Ferrell’s character in the LEGO Movie, they’d be right: there’s a lot that resonated in that film, sadly. But hey, that’s how it is. Just call me Lord Business.

In an attempt to assuage my feeling of guilt, I decided to create a better Death Star for my kids – for one of my sons in particular – to use with (or display, as they wish) their own LEGO Star Wars figures. It seemed like a great opportunity to have a play with 123D Make (and, of course, use AutoCAD in the mix), as well as to execute my first real project at the local Fab Lab.

Of course I started by modelling my Death Star in AutoCAD. Here’s the sequence of commands (which you should be able to save as a script or copy & paste into the command-line) I used to do it:

_.sphere 0,0,0 10

_.select l

_.sphere 6.2,6.2,6.2 4.5

_.subtract p  l

_.move l  0,0,0 0,0,10

_.select l

_.box -5,-5,0 5,5,1

_.subtract p  l

_.move l  0,0,1 0,0,0

I didn’t look for actual dimensions, mind… I’m not quite that anal: I just did what looked OK. And I flattened off the bottom to stop it rolling away, something that wouldn’t have been needed in outer space.


Death Star

I used the STLOUT command to create an STL file of the resulting solid.

Over in 123D Make – which I went ahead and updated… the version on my system was 4 years old, already! – I imported the STL file. The model came in quite faceted – I’d made the rookie mistake of leaving FACETRES at the default of 0.5, rather than upping it to 10 – but I’ll know better for the next one I make.
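If you’re reproducing this, the fix is a one-liner at the command-line (or at the top of the script) before exporting – FACETRES accepts values from 0.01 up to 10:

_.facetres 10

With that set, the STL generated by STLOUT comes out much smoother.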

The Death Golf Ball

Looking at design options… I liked the “interlocked slices” option, the downside being the limited access through the holes for placing the minifigures, etc.:

One design option

I ended up opting for “radial slices”, which allowed for much easier access:

The final option I chose

The sheets of MDF we had available were 430x730mm, which meant we’d need 5 to complete a Death Star that was 40cm high, making it basically the same size as the LEGO original.

With the sheet layout

The cutting was done on a Trotec Speedy 300:

The hardware

I saved the plans from 123D Make into EPS format and imported those into Illustrator to print to the Trotec printer driver. Beam frequencies, etc., are all set by colour. Super-clunky, but effective enough. (I can see why projects such as this one on Labs for the Coherent METABEAM 400 are happening, though.)


Laser cutting

Once the pieces were cut, it was time to build…

The cut pieces

The Fab Lab manager, Gaëtan Bussy, had suggested reducing the material thickness to 3.8mm (from 4mm) inside 123D Make, to reduce the play in the built piece. This was great advice, and certainly made the fit of the joints extremely snug. To make it easier to get the pieces fitting, my wife suggested using soap (I’d been about to sand them slightly), something she’d apparently learned making models as a child. More great advice. :-)

The first vertical slice was a pain to get in

The last one was much easier

For a first attempt, I was really happy with the results. The best bit, of course, was seeing my son’s face when he got home, this evening. Followed closely by seeing him fill it with his minifigures. :-)

My son playing with his new Death Star

Many thanks to Gaëtan Bussy and Fab Lab Neuchatel for helping make my son so happy.

Update:

Here’s the 123D Make project for the Death Star (this one is less faceted than the one I ended up building), in case anyone wants to build their own.


February 02, 2015

REAL 2015 in San Francisco from February 25-27

REAL 2015

There’s a really interesting conference in San Francisco, later this month (groan). It’s being held in Fort Mason on February 25-27, and covers all aspects of reality computing, whether relating to capture, compute or create:

Autodesk is the main organizer of the event, but there are lots of other companies and institutions involved, too. Check out the list of speakers: there are executives and specialists from Autodesk lined up, as well as several eminent names from the industry.

Here's a quick video giving a sense of what the conference is all about:



And in case you’re still interested in registering, it seems TenLinks has a 25% discount on offer.


I was planning on heading across to attend the conference, but then realised I already had vacation booked for that week. Hopefully it’s an event that will happen again in future years.

January 30, 2015

Using NFC to launch your Google Cardboard app

When I first got my Google Cardboard toolkit – in my case the DODOcase VR Toolkit V1 – I was fascinated by the NFC tag that came with it. I hadn’t really played with NFC, prior to that, and only had a very vague idea of what it was about.

It turns out that NFC is this important new(ish) technology that’s enabling all kinds of local (hence “near-field”) communications. Especially, it seems, in the area of mobile payments with technologies such as Apple Pay. Anyway, my first exposure to NFC was via this business card-sized sticker that came in the VR Toolkit.

Here’s that sticker held up to the light, so you can see its integrated circuit:

The DODOcase NFC sticker

As a technology NFC relies on some underlying RFID standards. Which is great because it allows me to digress and talk about how RFID was used recently at Autodesk University 2014. :-)

Here’s a picture of my AU2014 badge, held up to the same light as the above NFC chip:

My AU 2014 badge, held up to the light

Between the AU logo and my name, you’ll see another printed circuit. This was an RFID tag that allowed the event organizers to track the movement of people between the various AU sessions. People were able to opt out of this at registration, apparently, although I wasn’t aware it was possible until the topic came up during my SensorTag class (where we had a really good discussion in the room about possible uses for sensor technology). I was aware it was happening, though, and I even signed up for access to the internal dashboard.

I can tell you, for instance, that Wednesday’s exhibit hall was the best attended single event of the conference, followed closely by the opening keynote. And that 24% fewer people showed up to breakfast on Wednesday than on Tuesday. And 16% fewer on Thursday than Wednesday. Ah, big data. :-)

Anyway, let’s get back on track. The NFC tag in Google Cardboard is intended to launch an appropriate app on your (Android) device when the back of the phone comes into contact with the tag. This usually happens when you either insert the phone into the cardboard holder or close the back flap.

These NFC tags are typically re-writable, which gives some interesting possibilities for application developers: you can very easily customize the behaviour of the DODOcase’s NFC tag, having it launch your own app or web-page when it comes in contact with someone’s phone.

The Google Play Store has a number of NFC-related apps, allowing you to read and write NFC tags. The first one I used – when I only wanted to write tags to launch web-pages – was NFC Writer by Tagstand. The tool I recommend now – assuming you want to both read and write all kinds of NFC tags – is NFC Tools.

Here’s what NFC Tools said when I used it against the standard DODOcase tag (I’ve merged a couple of screenshots to get all the data on one view):

NFC Tools reading our tag

The main items of interest are way down at the bottom, in Records 0 and 1. These contain the information that causes Android – when the tag touches the back of a phone that isn’t in tag-reading mode – to bring up the Google Play Store with the option to install the DODOcase app:

DODOcase app on the Play Store

If the app is installed already, it’ll simply launch, of course.


DODOcase VR Store

So how can we get another app to load, instead of this one? Let’s take a look at how NFC Tools can be used to have a specific web-page or app launched when it touches your device.

We’ll start with a web-page for which we need a URL record placed on the tag. Here’s how to have http://autode.sk/gcbv launch when the tag touches your device:


Using NFC Tools to write a custom URL tag
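Incidentally, if you’d rather write the same kind of record from your own Android code than via the NFC Tools app, the platform APIs make it fairly painless. Here’s a rough sketch – not production code – of writing a URI record to an NDEF-formatted tag handed to you by an NFC intent:

import android.nfc.FormatException;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import android.nfc.Tag;
import android.nfc.tech.Ndef;
import java.io.IOException;

public class TagWriter {

  // The Tag typically comes from the NfcAdapter.EXTRA_TAG extra of an
  // ACTION_NDEF_DISCOVERED (or ACTION_TAG_DISCOVERED) intent
  public static boolean writeUri(Tag tag, String uri) {
    NdefRecord record = NdefRecord.createUri(uri);  // e.g. "http://autode.sk/gcbv"
    NdefMessage message = new NdefMessage(new NdefRecord[] { record });

    Ndef ndef = Ndef.get(tag);
    if (ndef == null)
      return false;                                 // tag isn't NDEF-formatted

    try {
      ndef.connect();
      if (!ndef.isWritable() ||
          ndef.getMaxSize() < message.toByteArray().length)
        return false;                               // read-only or too small
      ndef.writeNdefMessage(message);
      return true;
    } catch (IOException | FormatException e) {
      return false;
    } finally {
      try { ndef.close(); } catch (IOException ignored) { }
    }
  }
}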

The problem I had with this approach on my phone was that it would launch the page repeatedly, presumably when the contact between the phone and the tag was interrupted for some reason. This apparently doesn’t happen with all devices, but it was certainly enough for me to give up on NFC when I first started working on web-based samples.

On creating a native Android app, however, NFC became more interesting again, because even if a re-launch occurs, it won’t create a new instance of the app. Here’s how you can write an AAR – an Android Application Record – to your NFC tag, causing the com.autodesk.a360cardboard app (specified in the manifest shown in the previous post) to either get installed or launched when the tag is detected. (Bear in mind that this app isn’t actually in the Google Play Store so it won’t attempt to install it.)


Using NFC Tools to write a custom AAR tag
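Again, should you prefer to do this programmatically, an AAR is just a different record type – the package name below is the one from the manifest shown in the previous post:

// An Android Application Record points at a package rather than a URL
NdefRecord aar =
  NdefRecord.createApplicationRecord("com.autodesk.a360cardboard");
NdefMessage message = new NdefMessage(new NdefRecord[] { aar });
// ...then write the message to the tag exactly as in the sketch above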

When the tag comes into contact with your device, your custom app will launch:


NFC in action

That's it for this quick look at how you can customize NFC tags for use with Google Cardboard.

January 28, 2015

Integrating web-based VR samples into a native Google Cardboard app using the Cardboard SDK for Android

I mentioned this in last Friday’s post: after building an Android app to bring our web-based VR samples to Gear VR, it made sense to do the same for Google Cardboard. It made sense for 3 reasons:

  1. Most importantly, I wanted to see what the additional capabilities of the Android SDK would bring to the web-based VR samples, particularly around the magnetic trigger button.
  2. Until the Note 4 gets its Lollipop update in “early 2015” – and WebViews support WebGL – there isn’t much more to do with Gear VR. I’ve completed the plumbing but am waiting for the toilet to arrive. OK, bad analogy. :-) My Nexus 4, on the other hand, is running Android Lollipop, so at least that’s one way to see how the web samples work when loaded inside a WebView.
  3. The supported development environment for Google Cardboard, these days, is Android Studio. After wrestling with Eclipse to get my Gear VR built using the Oculus Mobile SDK, I was keen to give Android Studio a try.

The Cardboard SDK for Android is really easy to include in your Android Studio project. I started by cloning the primary sample from GitHub and importing it into Android Studio. Once I had it working on my device, I created a project from scratch, added the libs and copied across big chunks of the main activity.

Android Studio - a breath of fresh air after Eclipse

As we’re doing the stereo rendering of the model via an embedded web-page, we’re primarily using the Cardboard SDK for information on when the magnetic trigger on the side of the device gets pulled (something that we couldn’t get from HTML).

Google Cardboard and its single button

It would have been great to have had information regarding the speed or duration of the pull: you really only get told that it was pulled. But that’s fair enough… in the main activity’s Java file we make do with what we have by implementing some basic “double-click” logic.
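The heart of that logic is the onCardboardTrigger() override that the Cardboard SDK’s CardboardActivity provides. Here’s a rough sketch of the approach – the timing threshold and the helper names are illustrative rather than the actual project code:

import android.os.Handler;
import com.google.vrtoolkit.cardboard.CardboardActivity;

public class MainActivity extends CardboardActivity {

  private static final long DOUBLE_PULL_MS = 400;  // illustrative threshold
  private final Handler handler = new Handler();
  private Runnable pendingSingle = null;
  private int downClicks = 0;                      // running count, later passed to the page

  // Called by the Cardboard SDK whenever the magnetic trigger is pulled.
  // The SDK only tells us *that* it was pulled, so single vs. double
  // pulls have to be derived from the time between successive calls.
  @Override
  public void onCardboardTrigger() {
    if (pendingSingle != null) {
      // A second pull arrived in time: treat the pair as a double pull
      handler.removeCallbacks(pendingSingle);
      pendingSingle = null;
      openSelectedModel();
    } else {
      // Wait briefly before committing to a single pull
      pendingSingle = new Runnable() {
        @Override
        public void run() {
          pendingSingle = null;
          downClicks++;
          advanceSelection(downClicks);
        }
      };
      handler.postDelayed(pendingSingle, DOUBLE_PULL_MS);
    }
  }

  private void advanceSelection(int clicks) { /* hypothetical: drive the HTML menu */ }
  private void openSelectedModel() { /* hypothetical: open the selected model */ }
}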

Beyond that we need to make sure the AndroidManifest.xml had the appropriate entries…

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.autodesk.a360cardboard">

  <uses-permission android:name="android.permission.CAMERA"/>
  <uses-permission android:name="android.permission.NFC"/>
  <uses-permission android:name="android.permission.VIBRATE"/>
  <uses-permission android:name="android.permission.INTERNET"/>

  <application
      android:allowBackup="true"
      android:icon="@drawable/ic_launcher"
      android:label="@string/app_name"
      android:theme="@style/AppTheme">
    <activity
        android:name=".MainActivity"
        android:label="@string/app_name"
        android:launchMode="singleTask"
        android:screenOrientation="landscape"
        android:configChanges="orientation|keyboardHidden|keyboard">
      <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.LAUNCHER"/>
        <category
          android:name="com.google.intent.category.CARDBOARD"/>
      </intent-filter>
      <intent-filter>
        <action android:name="android.nfc.action.NDEF_DISCOVERED"/>
        <category android:name="android.intent.category.DEFAULT"/>
        <data
          android:mimeType="application/com.autodesk.a360cardboard"/>
      </intent-filter>
    </activity>
  </application>
</manifest>

The interface itself is fairly rudimentary: it uses the same HTML/JavaScript as the Gear VR sample, but advances the selection when a single pull on the magnetic trigger is detected. If a double-pull is detected, the selected model is opened. A trigger-pull from within a model goes back to the main list by reloading the page… and to restore the selection in the list, we pass through the number of “down” clicks we’ve counted since the app was loaded. The JavaScript takes the modulo remainder of that count to determine which item to select. A little crude, but it avoids having the JavaScript call back into our Java code.
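For what it’s worth, the hand-off to the page is just a matter of driving the WebView’s JavaScript from the Java side – something along these lines, where selectItem() is a hypothetical function defined by the page and MENU_URL is the address of the menu page:

// Ask the page to move its selection on; the page applies the modulo
webView.evaluateJavascript("selectItem(" + downClicks + ");", null);

// ...and from within a model, a trigger-pull reloads the menu page,
// passing the running click count through a (hypothetical) URL parameter
webView.loadUrl(MENU_URL + "?clicks=" + downClicks);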


The menu

Overall it works pretty well. The performance of the embedded WebView seems as good as with the web-based samples in Chrome for Android: they’ve done a good job of making sure the container itself doesn’t add overhead. Plus you get the benefits of being properly full-screen – without the need for some user input, as you have in HTML – and the “always on” is managed for you: no need to go and make sure your screen doesn’t turn off after 30 seconds (my Nexus has it set to 30 minutes, these days, which is good for VR but less good for normal usage).

The double-click takes a bit of practice to get right: at the beginning you often register two single clicks instead, which means you have to loop back round to select the model you wanted (which is irritating). But it’s fairly usable, given the limited input options we have available.

Speaking of input options: I spent some time trying to work out how to enable speech recognition inside a WebView. This *should* be possible with Lollipop, as you can now pre-grant permissions for loaded web-pages as long as the app itself has compatible permissions granted by the user, but I wasn’t able to get it working. This is still at the bleeding edge, so I’m hopeful that will be enabled, at some point.

Next time we’re going to talk a little about NFC, and see how that can be used effectively with Google Cardboard to launch our custom Android app.

photo credit: othree via photopin cc

January 26, 2015

Using environment variables inside AutoCAD file path options

Operating System-level environment variables are a handy way to reduce redundancy or to simplify providing support for per-user settings. (I’m sure they’re good for other things, too, but these are the ones that spring to my mind, at least. :-)

One thing I only discovered recently – and thanks to Tekno and Dieter for discussing this – is that you can use environment variables in a number of places in the file path settings accessed via AutoCAD’s OPTIONS command. The topic came up in the specific context of the TRUSTEDPATHS settings, but it seems to have more general support than that.

In this post we’re going to look at how you can test this capability, to see for ourselves how it might be used. For many of you the information in this post will be considered very basic, but it seemed worth covering, nonetheless.

Let’s take a module – available at C:\Program Files\MyCompany\MyProduct\MyApplication.dll – for which we have demand-loading entries set up in the Registry:

Demand loading settings

This module gets loaded as AutoCAD launches, but always brings up the “secure load” dialog:

Secure load request

The right way to avoid this warning is to add the application’s folder – C:\Program Files\MyCompany\MyProduct – to the TRUSTEDPATHS system variable. We could do this explicitly, of course, but let’s see how we might also do this with an environment variable.

Inside a Command Prompt, we’re going to type “set myproddir=C:\Program Files\MyCompany\MyProduct”. We can then test this by echoing – or using dir to check the contents of – the directory referenced in %myproddir%.

Setting an environment variable

This will only be set for processes that are children of this command-prompt, so after this we go to the AutoCAD Program Files folder and launch acad.exe from there. We could, of course, set this as a system-wide setting either via the Control Panel or using setx from the command-line, but for testing purposes we’d actually like to see what happens when the variable isn’t defined.
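As a quick reminder of the difference between the two: set only affects the current Command Prompt (and processes launched from it), while setx persists the value for future processes – and note that setx wants quotes around a value containing spaces:

set myproddir=C:\Program Files\MyCompany\MyProduct
setx myproddir "C:\Program Files\MyCompany\MyProduct"

Bear in mind that setx doesn’t change the current Command Prompt’s environment – you’d still use set (or open a new prompt) for the current session.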

Once AutoCAD has come up, we can reference the environment variable using %myproddir% in the Trusted Paths setting:

Using the environment variable inside our options

If you close the OPTIONS dialog and reopen it, you’ll see the path has been resolved and displays as the actual location:

Reopening the OPTIONS dialog shows the resolved value

When you relaunch AutoCAD via our Command Prompt, you should no longer see the load warning for our custom module. However, if you launch AutoCAD from the usual shortcut – without having set the variable at the system level – you’ll see %myproddir% in the list as it remains unresolved. Which means we’re storing the name of the variable – rather than the resolved value – and attempting to resolve it on launch.

In case you want to access this environment variable from the command-line, you can do so using (getenv):

Command: (getenv "myproddir")

"C:\\Program Files\\MyCompany\\MyProduct"

That’s it for this basic introduction to using environment variables. I’d be curious to hear from people on how they use them in their own applications. Please post a comment, if you have scenarios you’d like to share.

Update:

Thanks to Glenn Ryan for mentioning the possibility of using the REG_EXPAND_SZ type to allow environment variables in Registry keys to be expanded. I tried it with the above demand-loading keys and it works perfectly (assuming you’ve used the Control Panel or setx to give the myproddir variable a broader scope):

Modified demand-loading settings

This is really what you want, to be able to use the environment variable for your application’s path in multiple places whether inside or outside AutoCAD. Thanks, Glenn!

January 23, 2015

HoloLens for CAD?

I’m down with some kind of stomach bug, so any thoughts I might have had of writing code today are out the window. But luckily there’s plenty of juicy technology news buzzing around – especially in the AR space – that’s worth reporting on. Interestingly this isn’t the first time this has happened. I wonder if my sickness-addled brain has a tendency to gravitate towards “out there” technologies such as AR & VR (especially since reading John C. Wright’s The Golden Age trilogy – a must for anyone interested in this domain)?

I spent a fair amount of time working with VR, this week: I took the web-based VR samples and created a native wrapper for the Samsung Gear VR, and yesterday I decided that Google Cardboard deserved similar treatment, so I went and created a simple Android app using the Google Cardboard SDK and much of the same code. More on that next week.

I believe the immersive experience of VR is going to be really important for the entertainment industry and some limited use-cases for design professionals, such as walk- and fly-throughs of a purely virtual space (perhaps in the context of digital cities). It’ll also be useful for other kinds of design review that aren’t anchored to the real world. Of course it could also be the pre-cursor to us all living in shoeboxes, only interacting inside virtualities, but I’m not ready to go there just yet. :-)

In my opinion the “big ticket” item for our industry in the coming years – assuming various tricky problems get solved – is going to be AR. And interesting details on a couple of promising technologies in this space have emerged over the last few days.

The first is from Magic Leap, a secretive start-up that has recently received upwards of half a billion dollars in venture capital from the likes of Google. Oh, and they’ve also appointed another sci-fi hero of mine (although a lot of his work is termed speculative rather than science fiction) as their Chief Futurist: Neal Stephenson, the author of the incredibly prescient novel Snow Crash.

More details regarding the Magic Leap technology have been inferred by the tech press from a recent 180-page patent filing. Inevitably, most of their focus is on entertainment, i.e. the living room:

Entertainment use case

But lip-service is at least paid to professional activities, which will now be possible in both the office and the living room. ;-)

Architecture use case

All very exciting!

Perhaps surprisingly – although perhaps not, if you happen to believe that Nadella’s Microsoft has turned a corner for whatever reason – the Magic Leap excitement was displaced very quickly on the AR front pages by the Windows 10 (and related) announcements. Yes, Windows 10 will be a free upgrade, and Microsoft is marketing Surface Hub as a huge, interactive workspace (which I have no doubt will be very cool, especially for people who have to collaborate with teams in other sites, as I do), but the main bit of news from the recent announcement was another AR-related technology, Microsoft HoloLens.

HoloLens

So far this looks very, very impressive. I know it’s a marketing video, but this just blows me away:




This video provides some additional, helpful context:




When looking into the announcement – and I fully admit that I haven’t yet found time to sit through the 2+ hour recorded webcast – I came across this photo in an article:

HoloLens and AutoCAD

I do believe that’s AutoCAD! :-) Now this isn’t a frame from the launch video, so I can only assume they decided to create a more direct link between the modeling environment and the hologram, showing it in 3D in both places. But nonetheless it’s indicative of the way people are thinking about this technology: it definitely has widespread implications for the design industry.

Now I’m not privy to any strategic discussions between Autodesk and Microsoft about HoloLens, but I’m certainly excited about the potential. Windows 10 will apparently ship with HoloLens APIs and capabilities… there’s even been some mention of Oculus and Magic Leap being two possible devices that might be hooked into the underlying Windows Holographic infrastructure.

I’d be very curious to see what can be done with this technology, and also how the battle for AR-mindshare plays out from here.

What do you all think? Excited? Depressed? Too early to say? Post a comment!

HoloLens photo credit: Microsoft Sweden via photopin cc

January 21, 2015

Building a Gear VR app using the Oculus Mobile SDK

Today we’re following on from last week’s post introducing this project where we want to convert the Google Cardboard A360 samples to work in a more integrated manner with the Samsung Gear VR.

The main purpose of the project is to see how we can hook up the existing, web-based samples to take advantage of the Gear VR’s hardware. We definitely don’t want to re-implement any of the visualization stack to be native; if we can use UI hardware events to control the web-based samples in a meaningful way, that’ll be just fine for this proof-of-concept.

It took some work but I was able to get the Oculus software running on the Note 4 inside Gear VR. I had to downgrade the software for it to work, though – and to a version that has some quirks – but it’s definitely usable. And it’s impressive: the Oculus team have done a great job implementing a nausea-free experience. You can view 360-degree videos and panoramic photos, and even watch a movie while sitting in a virtual cinema. Very neat!

The next step, for this project to work, was to build an Android app using the Oculus Mobile SDK. As mentioned last time, until the Note 4 device I’m using runs Android Lollipop, I’m not actually going to be able to effectively load a page containing the A360 viewer – which is WebGL-based – inside a WebView control, but I could at least work on the UI plumbing in the meantime.

It took me about a day of work to get the initial project building properly. I had to install a lot of software, including…

  • JDK – the Java SE Development Kit 8
  • Eclipse Juno (I first tried Indigo, but that didn’t work, and it seems Android Studio isn’t yet supported)
  • ADT Plugin for Eclipse (this is no longer easy to get hold of, now that Android Studio is the supported Android development IDE)
  • Android NDK (we’re not really using native C/C++ code in our sample, but we’re basing our project on a native – rather than Unity-based – sample)
  • Android 4.4.2 SDK (API 19)

I then had to create a basic project using the VrTemplate project: there’s a handy script to stamp out versions of this template project with your own company and app name. Again, while this has a native component to it, we’re going to avoid using C/C++ code unless we really have to.

There were quite a few dead-ends I followed to get to this point. Getting an .apk built ended up being quite an achievement: getting it to install and load was another one…

I eventually managed to get the app to launch on the Note 4 via USB debugging, but clearly relying on a USB cable to deploy the app meant it couldn’t be mounted in the Gear VR headset at the same time. I could use this approach to work out how to get input from the Bluetooth gamepad, at least, but it was only going to get me so far.

One important step was to generate a device-specific signature file, but in the end it’s this forum thread that got me to the point where I could effectively deploy and run the app inside the Gear VR headset. I needed to enable wifi debugging for adb – which gave me freedom from the USB cable – but also to make some changes to the application’s manifest to make sure it had the “vr_only” flag set (the docs stated that “vr_dual” was needed for testing as a normal Android app or as a VR app, but in reality “dual” really ended up meaning “vanilla Android only”). Setting “vr_only” meant that launching the app brought it up directly inside the VR environment. Perfect!
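For reference, the flag in question is a meta-data entry inside the application element of the manifest – something like this (quoting from memory, so check the Oculus Mobile SDK docs for the definitive form):

  <meta-data
      android:name="com.samsung.android.vr.application.mode"
      android:value="vr_only"/>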

Now for a look at what’s now working, and where the app needs to go…

I modified the main activity to include a WebView control which then gets pointed at an updated set of samples for Gear VR. This version of the samples has some cosmetic differences: rather than having a single menu of models to pick from, the menu is duplicated, once for each eye. I’ve also adjusted the colours to make the page less visually jarring when viewed in an immersive environment.

Now for the main point of the post… I was able to hook up the UI events provided by the Oculus Mobile and Android SDKs – whether from the Bluetooth gamepad or from the touchpad on the side of the headset – and wire these into the HTML samples. Basically we call JavaScript functions in the WebView whenever we need to, and this controls the HTML page.

Here’s a quick video of the gamepad controlling the UI, as a “for instance”.




The main Java file in the sample shows how the WebView is created and how the Gear VR hardware is able to communicate with it.
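In outline – and this is a simplified sketch rather than the actual project file, with hypothetical URLs and JavaScript function names – the wiring looks something like this: create the WebView, enable JavaScript, point it at the samples page, and translate the hardware key events into JavaScript calls:

import android.app.Activity;
import android.os.Bundle;
import android.view.KeyEvent;
import android.webkit.WebView;

public class MainActivity extends Activity {

  private static final String SAMPLES_URL =
    "http://example.com/vr/gearvr.html";            // hypothetical address

  private WebView webView;

  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    // Host the web-based VR samples in a full-screen WebView
    webView = new WebView(this);
    webView.getSettings().setJavaScriptEnabled(true);
    setContentView(webView);
    webView.loadUrl(SAMPLES_URL);
  }

  // Gamepad buttons and the headset touchpad arrive as key events:
  // we simply forward them to the page as JavaScript calls
  @Override
  public boolean dispatchKeyEvent(KeyEvent event) {
    if (event.getAction() == KeyEvent.ACTION_DOWN) {
      switch (event.getKeyCode()) {
        case KeyEvent.KEYCODE_DPAD_DOWN:
          webView.evaluateJavascript("moveDown();", null);
          return true;
        case KeyEvent.KEYCODE_DPAD_UP:
          webView.evaluateJavascript("moveUp();", null);
          return true;
        case KeyEvent.KEYCODE_BUTTON_A:
        case KeyEvent.KEYCODE_ENTER:
          webView.evaluateJavascript("select();", null);
          return true;
      }
    }
    return super.dispatchKeyEvent(event);
  }
}

(In the real project the activity derives from the Oculus Mobile SDK’s VR activity class rather than a plain Activity, but the WebView plumbing is the same idea.)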

The next step is to enable the use of the controls once viewing a particular model: especially zoom and explode. But at this stage that’s a relatively simple problem to solve: the plumbing should now all be in place for this to happen.

At some point I’d like to look at changing the model-selection page away from HTML, integrating it more tightly with the Oculus menuing system (which is much easier on the eyes… having a menu that stays fixed in 3D space – at least as far as your brain is concerned – is much easier to look at than one that follows your eyes). But that would involve some duplication: right now having the models enumerated only in HTML is much simpler to update.

photo credit: TechStage via photopin cc

January 16, 2015

Modifying the contents of an AutoCAD xref using .NET

An interesting query came into my inbox, last week. A member of one of our internal product teams was looking to programmatically modify the contents of an external reference file. She was using the code in this helpful DevBlog post, but was running into issues. She was using WblockCloneObjects() to copy a block definition across from a separate drawing into a particular xref, but found some strange behaviour.

In this post I’m going to show the steps we ended up following to make this work. We’re going to implement a slightly different scenario, where we modify an external reference to load a linetype and apply that to all the circles in its modelspace.

The application is split into two separate commands: our CRX command (for CReate Xref… yes, I know this name’s a bit confusing) will create an external drawing file – consisting of a circle with a polyline boundary around it – and reference it in the current drawing. There’s nothing very special about this command: we just create an external Database, save it to disk, and then run the standard ATTACH command via ed.Command().

The second command is called CHX (for CHange Xref) and this is the more interesting one: it attempts to edit any external reference found in modelspace, manipulating each one to have all its circles given the “dashed” linetype. If that linetype isn’t loaded it’ll go ahead and load it.

Here are the two commands in action:


Modify external reference



The main “trick” we had to resolve when using WblockCloneObjects() was to set the WorkingDatabase to be that of the xref. This step wasn’t needed for loading linetypes, but I’ve left the code commented out for setting the WorkingDatabase appropriately: you may well need to uncomment it for your own more complex operations.

The other important step was to check whether the xref’s underlying file is accessible for write: if the drawing is open in the editor or is read-only on disk (unlikely in our scenario, as we’re creating it) then the XrefFileLock.LockFile() operation will throw an exception (but also keep a file lock in memory which will throw another exception when eventually finalized… not good). So we need to preempt the exception, avoiding the conditions under which it would be thrown.

We do this using this approach to check whether the file is in use, extended to also detect whether the file is read-only on disk. It can be extended with additional file access exceptions, if the two that I’ve included don’t end up covering the various scenarios.

Here’s the C# code:

using Autodesk.AutoCAD.ApplicationServices;
using Autodesk.AutoCAD.DatabaseServices;
using Autodesk.AutoCAD.Geometry;
using Autodesk.AutoCAD.Runtime;
using System.IO;
using System;

namespace XrefManipulation
{
  public class Commands
  {
    const string xloc = @"c:\temp\xref.dwg";

    [CommandMethod("CRX")]
    public void CreateXref()
    {
      var doc = Application.DocumentManager.MdiActiveDocument;
      if (doc == null)
        return;

      var ed = doc.Editor;

      // Create our Database with the default dictionaries and
      // symbol tables

      using (var db = new Database(true, true))
      {
        using (var tr = db.TransactionManager.StartTransaction())
        {
          // We'll add entities to its modelspace

          var bt =
            (BlockTable)tr.GetObject(
              db.BlockTableId, OpenMode.ForRead
            );
          var ms =
            (BlockTableRecord)tr.GetObject(
              bt[BlockTableRecord.ModelSpace], OpenMode.ForWrite
            );

          // Create a Circle inside a Polyline boundary

          var c = new Circle(Point3d.Origin, Vector3d.ZAxis, 1.0);

          ms.AppendEntity(c);
          tr.AddNewlyCreatedDBObject(c, true);

          var p = new Polyline(4);
          p.AddVertexAt(0, new Point2d(-1, -1), 0, 0, 0);
          p.AddVertexAt(1, new Point2d(-1, 1), 0, 0, 0);
          p.AddVertexAt(2, new Point2d(1, 1), 0, 0, 0);
          p.AddVertexAt(3, new Point2d(1, -1), 0, 0, 0);
          p.Closed = true;

          ms.AppendEntity(p);
          tr.AddNewlyCreatedDBObject(p, true);

          tr.Commit();
        }

        // We're going to save our file in the specified location,
        // after erasing the file if it already exists

        if (File.Exists(xloc))
        {
          try
          {
            File.Delete(xloc);
          }
          catch (System.Exception ex)
          {
            if (
              ex is IOException || ex is UnauthorizedAccessException
            )
            {
              ed.WriteMessage(
                "\nUnable to erase existing reference file. " +
                "It may be open in the editor or read-only."
              );
              return;
            }
            throw;
          }
        }
        db.SaveAs(xloc, DwgVersion.Current);
      }

      // The simplest way to attach the xref is via a command

      ed.Command("_.-ATTACH", xloc, "_A", "25,12.5,0", 5, 5, 0);
    }

    [CommandMethod("CHX")]
    public void ChangeXref()
    {
      var doc = Application.DocumentManager.MdiActiveDocument;
      if (doc == null)
        return;

      var ed = doc.Editor;
      var db = doc.Database;

      // Get the database associated with each xref in the
      // drawing and change all of its circles to be dashed

      using (var tr = db.TransactionManager.StartTransaction())
      {
        var bt =
          (BlockTable)tr.GetObject(
            db.BlockTableId, OpenMode.ForRead
          );
        var ms =
          (BlockTableRecord)tr.GetObject(
            bt[BlockTableRecord.ModelSpace], OpenMode.ForRead
          );

        // Loop through the contents of the modelspace

        foreach (var id in ms)
        {
          // We only care about BlockReferences

          var br =
            tr.GetObject(id, OpenMode.ForRead) as BlockReference;
          if (br != null)
          {
            // Check whether the associated BlockTableRecord is
            // an external reference

            var bd =
              (BlockTableRecord)tr.GetObject(
                br.BlockTableRecord, OpenMode.ForRead
              );
            if (bd.IsFromExternalReference)
            {
              // If so, get its Database and call the function
              // to change the linetype of its Circles

              var xdb = bd.GetXrefDatabase(false);
              if (xdb != null)
              {
                // Lock the xref database

                if (
                  IsFileLockedOrReadOnly(new FileInfo(xdb.Filename))
                )
                {
                  ed.WriteMessage(
                    "\nUnable to modify the external reference. " +
                    "It may be open in the editor or read-only."
                  );
                }
                else
                {
                  using (
                    var xf = XrefFileLock.LockFile(xdb.XrefBlockId)
                  )
                  {
                    // Make sure the original symbols are loaded

                    xdb.RestoreOriginalXrefSymbols();

                    // Depending on the operation you're performing,
                    // you may need to set the WorkingDatabase to
                    // be that of the Xref

                    //HostApplicationServices.WorkingDatabase = xdb;

                    ChangeEntityLinetype(
                      xdb, typeof(Circle), "DASHED"
                    );

                    // And then set things back, afterwards

                    //HostApplicationServices.WorkingDatabase = db;

                    xdb.RestoreForwardingXrefSymbols();
                  }
                }
              }
            }
          }
        }
        tr.Commit();
      }
    }

    internal virtual bool IsFileLockedOrReadOnly(FileInfo fi)
    {
      FileStream fs = null;
      try
      {
        fs =
          fi.Open(
            FileMode.Open, FileAccess.ReadWrite, FileShare.None
          );
      }
      catch (System.Exception ex)
      {
        if (ex is IOException || ex is UnauthorizedAccessException)
        {
          return true;
        }
        throw;
      }
      finally
      {
        if (fs != null)
          fs.Close();
      }

      // File is accessible

      return false;
    }

    // Change all the entities of the specified type in a Database
    // to the specified linetype

    private void ChangeEntityLinetype(
      Database db, System.Type t, string ltname
    )
    {
      using (
        var tr = db.TransactionManager.StartTransaction()
      )
      {
        // Check whether the specified linetype is already in the
        // specified Database

        var lt =
          (LinetypeTable)tr.GetObject(
            db.LinetypeTableId, OpenMode.ForRead
          );
        if (!lt.Has(ltname))
        {
          // If not, load it from acad.lin

          string ltpath =
            HostApplicationServices.Current.FindFile(
              "acad.lin", db, FindFileHint.Default
            );
          db.LoadLineTypeFile(ltname, ltpath);
        }

        // Now go through and look for entities of the specified
        // class, changing each of their linetypes

        var bt =
          (BlockTable)tr.GetObject(
            db.BlockTableId, OpenMode.ForRead
          );
        var ms =
          (BlockTableRecord)tr.GetObject(
            bt[BlockTableRecord.ModelSpace],
            OpenMode.ForRead
          );

        // Loop through the modelspace

        foreach (var id in ms)
        {
          // Get each entity

          var ent = tr.GetObject(id, OpenMode.ForRead) as Entity;
          if (ent != null)
          {
            // Check its type against the one specified

            var et = ent.GetType();
            if (et == t || et.IsSubclassOf(t))
            {
              // In the case of a match, change the linetype

              ent.UpgradeOpen();
              ent.Linetype = ltname;
            }
          }
        }
        tr.Commit();
      }
    }
  }
}

This technique is really useful: I can imagine lots of scenarios where in-place xref modification could be used to develop a valuable application feature. If you agree and have a particular scenario you think is worth sharing – or perhaps would like to see some sample code developed for – please post a comment!

January 15, 2015

A360 viewer & Samsung Gear VR

Back in October, I had a lot of fun developing some Virtual Reality samples using Autodesk’s View & Data API. The samples instanced the A360 viewer component twice in a web-page and controlled the views on these two instances – maintaining a stereoscopic effect – as the mobile viewing device changed orientation.

Here’s a video we saw in a previous post to give a sense for how these demos work…




The original samples were developed for Google Cardboard – which many of you will have received at the recent DevDays events around the world – but they’re just as applicable for other mobile VR platforms.

One such platform that’s been getting a lot of interest over the last couple of months is the Samsung Gear VR. This is similar in concept to Google Cardboard, but goes way above and beyond in terms of comfort and visual quality. Gear VR currently works with only one model of smartphone, the Galaxy Note 4, although you can get a sense for the quality of the optics by holding other devices in place.

The ADN team has bought at least one of these to play with. Philippe Leefsma had one shipped across to the Neuchâtel office, but then went on vacation for the best part of January. I hate to see toys going unplayed with, so with Philippe’s permission I cracked the box and got stuck in. :-)

Geared up

It’s very important to note that the samples we’ve developed, so far, are for web-based VR: they make use of Three.js and WebGL via the View & Data API’s viewing stack to create a VR experience in the browser. This has certain benefits in terms of portability, of course – the samples work well on certain iPhones, although interestingly they have issues we so far haven’t been able to resolve on the iPhone 6 – but there are downsides in terms of native device integration. For Google Cardboard this only means you don’t get access to the information from the magnet-based switch on the side of the headset (as that’s only available in the Android-native Google Cardboard SDK), but for Gear VR the impact is more pronounced.

Gear VR has a micro-USB plug that connects the headset with the Note 4 when you mount it. This allows VR applications running on the Note 4 to use the Oculus Mobile SDK (yes, the Gear VR software was developed in conjunction with Oculus) to gain access to some additional capabilities. Rather than a simple toggle switch, Gear VR has volume & back buttons, as well as a separate gamepad that connects via Bluetooth. There’s also a proximity sensor so that the device can turn off when the headset is taken off (at least that’s what I think it does).

I’d like to use the SDK to add some additional capabilities to the web-based samples, at some point, but right now there are a few things getting in the way. Firstly, at the time of writing the Note 4 is still running Android KitKat (4.4.4), while the Gear VR software stack is all based on Lollipop (5.x). [It turns out this isn’t true: it’s just that not all Note 4 devices have a compatible chipset & firmware. More on this at the end of the post.] So plugging the phone into the Gear VR device just tells me I need to install the software… and then trying to do so results in this message.


Coming soon


Secondly, Android’s WebView component apparently doesn’t support WebGL in versions prior to Lollipop, either. This wouldn’t necessarily be an issue if we were using a native viewer component – which would no doubt give better performance/higher framerates – but doesn’t make sense for a prototype implementation. Better to take advantage of the existing web-based viewer, building a native wrapper around it, for now.

So really we’re stuck with trying out the existing web-based samples with Gear VR. These actually work surprisingly well, although I had to jump through some hoops to get them working.

The initial hoop related to WebGL support. After finding out that the samples didn’t load – and then using USB debugging to determine that WebGL wasn’t being found – I worked out that I could check on the WebGL status of the device by using chrome://gpu and then override the setting by using chrome://flags, as per the below animation (I think it’s only missing a “re-open the browser” step):


Chrome flags


The second hoop was more physical: I had to mount the Note 4 in the device without inserting the micro-USB plug. This turned out to be easy enough to do: it seems to be designed for the plug to flap up and allow the phone to be mounted – but remain disconnected – underneath.

One final hack – rather than a hoop, in this case – was to tape across the speaker on the back. With the Nexus the microphone beep is controlled by the system volume, but this isn’t the case with Samsung devices – and, as you’ll know if you’ve played the above video, the speech recognition sub-system does cause a beep at regular intervals. A physical block seems the only way to address this, right now.

In terms of the overall experience… the visual quality and field of view were excellent: you can also adjust the focal length, which is great for people who wear glasses. The fit is comfortable, and the level of immersion is much higher than with Google Cardboard.

Here’s a photo of my colleague Simon Freihart having fun with the demo:

Simon and the Gear VR

Ultimately, though, for the use-cases that interest me most – i.e. using the technology for design visualization, rather than for immersive games where you may stay for hours at a time – it may end up being somewhat redundant. The flexibility you get with Google Cardboard – the ability to use pretty much any smartphone – does make it very valuable, and the quality is probably “good enough”.

Which leads me to the next big debate. Is the Gear VR experience good enough to make me want to buy a Note 4 as my next smartphone? I’m a big fan of stock Android, à la Nexus line of smartphones & tablets… I really don’t like the Samsung software layer. Maybe the native experience with the Oculus Mobile SDK will blow me away, when I get to try it, but for now I’m not quite feeling compelled to take the plunge and order a Note 4 as my primary device (I suspect that’ll be the Nexus 6, but we’ll see). If I can persuade someone to buy it for me as a secondary device, that’ll be a-whole-nother story, of course. :-)

Update:

It turns out it’s possible to run Gear VR with a KitKat-fuelled Note 4. I may just have to upgrade (or downgrade?) the firmware for it to work. The Over-The-Air upgrade process certainly doesn’t recognise an update as pending… I’ll post another update if/when I get it working.

Update 2:

Interestingly I have (or rather Philippe has ;-) an unlocked Asia-Pacific Note 4 with the Exynos (rather than Snapdragon) processor. This page allowed me to install an older version of the Oculus runtime that supports this CPU. This should keep me going until a properly compatible firmware is available. Many thanks to HomerS66 for helping figure this all out!

January 14, 2015

Last few days to apply for the Autodesk Cloud Accelerator

I mentioned this event – an excellent opportunity to kickstart your web or mobile application development efforts – late last year.

The original submission deadline was last week, but this has been extended to January 17 – the end of this week! If you can put together a proposal (which shouldn’t be more than 1,000 words) by then, you may still be able to participate.

As a reminder, Autodesk is hosting the 2-week workshop in our downtown San Francisco offices from March 9-20, and will pay the hotel costs for 1-2 people per company. Get more information here.
