Archive for July, 2013

Gathering Data Protector Logs for HP

What follows is the recommended information and logs one should gather when opening a Data Protector case with HP.  Note: the following only applies to a Data Protector 7.01 Windows Environment.

  1. Server Specs 
  2. Data Protector Patch Level
  3. Full Session Report
  4. Extended logs

1) Server Specs

No matter the issue you are having, the server specs for the Cell Manager will invariably need to be provided.  This may also apply to your Installation Servers, Backup Host, or any other hosts affected by the issue.  Generally the OS and build is sufficient, for example: Windows 2008 R2 Enterprise x64.

2) Data Protector Patch Level

As with providing server specs, the patch level of all related systems will usually be requested as well.  Run this command from every host that has the DP inet agent installed:

omnicheck -patches

Copy/Paste the output into the support case for each system related to the issue.

3) Full Session Report

More than likely you will want to include the full session report from the backup job that had issues.  This report contains the detailed output of the backup job, including all of the related messages you would see in the DP GUI.  You can find the Session ID by browsing the Reporting context in the DP GUI and selecting your job.  The right pane will look something like this:

[138:742] Backup session "2013/07/22-3" of the backup specification "VEAgent VM-Desktops",backup group "VMWARE" has errors: 19.

In this example, 2013/07/22-3 is the Session ID. Copy/Paste that value into the following command, which you will execute on the Cell Manager:

omnidb -session <sessionid> -report > C:\session.txt

Upload the session.txt file to the HP Support case.
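If you are dealing with many report lines, pulling the Session ID out by hand gets tedious. A small convenience sketch (this helper is not part of Data Protector; the regex is based on the sample line shown above):

```python
import re

# The Session ID (e.g. 2013/07/22-3) appears quoted after "Backup session"
# in each report line shown in the Reporting context.
SESSION_RE = re.compile(r'Backup session "(\d{4}/\d{2}/\d{2}-\d+)"')

def extract_session_ids(report_text):
    """Return every Session ID found in a block of report text."""
    return SESSION_RE.findall(report_text)

sample = ('[138:742] Backup session "2013/07/22-3" of the backup '
          'specification "VEAgent VM-Desktops",backup group "VMWARE" '
          'has errors: 19.')
print(extract_session_ids(sample))  # ['2013/07/22-3']
```

Each extracted ID can then be fed to omnidb as shown above.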

4) Extended logs

In many cases, the default session report will not provide enough debug information to find the actual problem.  What is needed is extended logging.  To gather this additional debug info, you will have to first enable debugging, reproduce the issue, and then disable debugging.  The debug process will typically create dozens of log files, depending on the issue, which you can then zip up and upload to the HP Support case.  

There are two primary debug use cases:

  1. Troubleshooting Data Protector GUI
  2. Troubleshooting DP Backup Jobs that are failing/etc.

Troubleshooting DP GUI

Exit the GUI, and restart it from the MS-DOS prompt in debug mode:

cd \Program Files\Omniback\bin
manager.exe -debug 1-500 yourname.txt

Reproduce the error in the DP GUI and then exit the GUI to stop the debugging.  Depending on the nature of the issue, this will create debug files on every host related to the issue.  On each host, in the following location, look for long file names starting with OB2DBG and ending with yourname.txt:

(Windows 2003) Program Files\Omniback\tmp
(Windows 2008) ProgramData\Omniback\tmp

Gather all these debug files from each host, zip them up (per host), and upload to HP Support Case.
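The collection step lends itself to a small script. The sketch below is a generic helper, not a Data Protector tool: point it at the Omniback tmp folder on a host and the yourname.txt suffix you used, and it zips the matching OB2DBG files for upload.

```python
import glob
import os
import zipfile

def zip_debug_files(tmp_dir, suffix, out_zip):
    """Zip every OB2DBG* debug file ending in `suffix` from tmp_dir.

    tmp_dir - the Omniback tmp folder on this host
    suffix  - the name passed to -debug, e.g. "yourname.txt"
    out_zip - path of the zip archive to create (one per host)
    """
    pattern = os.path.join(tmp_dir, "OB2DBG*" + suffix)
    matches = sorted(glob.glob(pattern))
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in matches:
            # Store only the file name so the zip has a flat layout.
            zf.write(path, arcname=os.path.basename(path))
    return matches
```

Run it once per host, then upload each resulting zip to the support case.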

Troubleshooting Failing Backup Jobs

Depending on the nature of your failing backup job you might deviate from these steps, but this will generally apply to all failing backup jobs:

On the Cell Manager open the CMD prompt and execute the following:

omnisv stop
omnisv start -debug 1-500 yourname.txt

On any other related system(s) with a DP INET agent installed, go to Windows Services and stop the Data Protector INET service.  In the startup parameters for the service, add the following and then click Start:

-debug 1-500 yourname.txt

Reproduce the backup job failure, and exit the DP GUI.  This will create dozens of logs on the Cell Manager and on every related system where debugging was enabled.  Before you collect the logs, however, be sure to disable the debugging by removing the service startup parameters you added on each system and restarting the services normally.  Also, don't forget to restart the DP Cell Manager without the debug option:

omnisv stop
omnisv start

On each host, in the following location, look for long file names starting with OB2DBG and ending with yourname.txt:

(Windows 2003) Program Files\Omniback\tmp
(Windows 2008) ProgramData\Omniback\tmp

Gather all these debug files from each host, zip them up (per host), and upload to HP Support Case.

Finally, you will also want to grab a copy of the cell_info file from the Cell Manager and upload it as well:

(Windows 2003) Program Files\Omniback\Config\Server\cell
(Windows 2008) ProgramData\Omniback\Config\Server\cell

SharePoint Workflow Emails Using HTML

I've been bashing my head against the wall trying to understand why custom HTML used in a SharePoint Workflow "Send an Email" Action looked so awful when delivered.  In short, there seemed to be many more carriage returns, or new lines, than there should have been.  I was also having issues getting table background colors to render, and forcing externally referenced images to show.  

Inserting Custom HTML into Workflow Emails

Well, the good and bad news was that the answer was right under my nose the entire time.  Here's how I tracked down the last two issues mentioned above.  To add a background color to a table, insert an image, or implement any other custom HTML, you need to insert the raw HTML into the Advanced Properties window of the Send an Email action (reference).  Simply highlight the Email Action in your workflow and, from the top Ribbon, select Advanced Properties > Body > "…".  From there, you can insert HTML directly into the body of your email.  And it works perfectly, unless you forget one simple thing.

"Optimizing" Workflow Emails 

SharePoint likes to handle HTML very "differently" than most applications.  Frankly, that goes for most other Microsoft products, and HTML in Workflow emails is no exception.  The key point to remember is that Workflow emails containing HTML expect everything to be in one long string (reference).  No line breaks or carriage returns can be present between any of your tags.  Technically, this does shave a few hundred bytes (or maybe even a KB) off the document size, and could thus be considered "optimized."  However, the main issue here is that we are already jumping through so many hoops when working with the SPD String Builder (i.e. the Body editor of an email):  Copy/Paste is not possible, for instance, unless all "Lookups" have been removed.  By forcing HTML into one long string, we introduce another issue: the HTML becomes nearly unreadable to human eyes should troubleshooting or future modifications be required.
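Collapsing your markup into one long string is easy to script rather than do by hand. A minimal sketch (not a full HTML minifier, and not a SharePoint tool; it just strips line breaks and the whitespace between tags):

```python
import re

def to_one_long_string(html):
    """Collapse HTML to a single line, as SPD workflow emails expect.

    Removes carriage returns/newlines and the indentation between
    tags.  Not a real minifier: it does not touch whitespace inside
    text nodes that sit on a single line.
    """
    # Drop all line breaks, then squeeze the whitespace between tags.
    one_line = re.sub(r"[\r\n]+\s*", " ", html)
    return re.sub(r">\s+<", "><", one_line).strip()

html = """<table bgcolor="#EEEEEE">
  <tr>
    <td>Hello</td>
  </tr>
</table>"""
print(to_one_long_string(html))
# <table bgcolor="#EEEEEE"><tr><td>Hello</td></tr></table>
```

Keep the readable, indented version in a document library or source control, and paste only the collapsed output into the String Builder.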

Just to verify that SharePoint indeed expects HTML to be one long string, we can do the following.  Create a new "Send an Email" action and include some generic text, without using any HTML.

If we then use the steps above to access the Advanced Properties of the Send an Email action, we will see exactly what has occurred behind the scenes.

Thus, we need to follow suit if we wish for our emails to render even remotely close to what our markup dictates. Seriously though, I assumed the DTD for SPD 2010 was HTML 4.0, but HTML 3.2??? OMG! These days, most modern email clients will accept at least HTML 4.01 Transitional, and that is the DTD I am now using for all my SPD 2010 Workflow emails.

Final Note on SPD 2010 "Optimization"

Back in the days of SPD 2007, it was really easy to optimize HTML for pages that were not connected to a SharePoint site.  You could just go File > New > HTML Page, paste in your HTML, and optimize it.  This really elevated SPD from being strictly a SharePoint tool to a handy webmaster tool overall.  The Optimize functionality still exists in SPD 2010, buried in the menu Ribbon under Spelling > Optimize HTML > Remove Whitespace > HTML all whitespace.  However, by the nature of SPD 2010, this feature can only be used on a page that "lives" in a SharePoint site.  So in theory you could create a document library that contains workflow HTML templates in a human-readable form; whenever you need the HTML in a workflow, temporarily run the Optimize HTML function against your page, Copy/Paste the "optimized" HTML into the workflow, and close the HTML page in SPD without saving it.  While this isn't a terrible idea, it does seem counter-intuitive and clunky in some ways, but maybe it's just me being difficult.

Exporting Google Earth Models into 3DS Max using 3D Ripper DX

WARNING: 3D Ripper DX has been reported by several sites as containing a virus/malware. Whether it is a false positive or a real threat, I am not sure. Proceed with caution.

In my last post, I described how to export geometry from Google Earth into an OBJ file, using GLIntercept.  And despite the lengthy setup instructions, that is a solid approach to capturing geometry from OpenGL applications.  However, there is no direct way to export the geometry with the texture data/coordinates mapped automatically.  For my purposes, the goal remains to export a city from Google Earth into a modeling application.  And while Google Earth does NOT have stellar textures by any stretch, it would be nice to have these included in the exported model.

3D Ripper DX (link removed, see warning above) seems to do exactly that, and the setup is incredibly fast.  In just a few minutes I was able to follow this video tutorial to capture, export, and import an entire neighborhood, with textures, into 3D Studio Max.

 

As you can see, the textures aren't really that great, but they are perhaps better than none at all!  Also, based on the instructions given in the video tutorial, I might be able to capture crisper textures if I play around with it a bit more.  The only other downside (for me) is not knowing how to use 3D Studio Max, at all, but that is obviously a personal issue.

Up next?  Exporting this bad boy into UDK.  🙂

Here is another video that you may also find useful by Paul Fatkins:

See these updated posts for more info:

Part 1 – Exporting Google Earth Models into UDK

Part 2 – Exporting Google Earth Models into UDK

 

Google Earth to OBJ Using GLIntercept

In a recent post, I touched on a workflow to export models from Google Earth into UDK.  However, what if you wanted to repeat this process for an entire city?  Manually exporting every building one at a time would be extremely tedious and a complete waste of time, since the models in Google Earth are low-res and would have to be re-created at some point anyway, one at a time, for the final UDK product.  No thank you.  In such a scenario, what we really need is an entire 3D city exported from Google Earth to serve as a TEMPLATE.  Later, we can replace each and every building with a high-res model for our final UDK product.  Sound good?  Then let's get started.

For starters, I came across an article that details this exact scenario.  However, being nearly 4 years old (as of 2013) the article is quite dated, has dead links, and is otherwise missing some very critical pointers.  What follows is an updated overview of the same process from that article, with additional notes and steps to follow.

I wrote the following some time ago and can no longer assist with troubleshooting.  The fact is, this technique/software was already 5+ years old when I came across it, and as time goes by, it only becomes older and more unsupported.  I will not be answering any further comments on this thread, as I cannot help and have myself moved on to 3D Ripper DX.  That being said, if you still wish to proceed, take it slow, read everything, and good luck!!!

Prerequisites:

Setup Instructions:

  1. Install Google Earth.  The installer will install GE into Program Files (x86) by default (on modern Windows 7/8 systems).  To mitigate known issues running programs from the x86 directory, after the install I then copied the Google Earth folder to the root of my C drive.  Henceforth, I ran Google Earth from the C:\ root folder only.

     

    Copy Folder: C:\Program Files (x86)\Google\Google Earth
    To: C:\Google Earth
  2. Install GLIntercept.  Likewise, the installer will install this into Program Files (x86) by default (on modern Windows 7/8 systems). Again, to mitigate known issues running programs from the x86 directory, after the install I then copied the GLIntercept folder to my Program Files folder.

     

    Copy Folder: C:\Program Files (x86)\GLIntercept0_5
    To: C:\Program Files\GLIntercept0_5
  3. Extract OGLE plugin into the GLIntercept0_5\Plugins folder and rename the OGLE folder from "ogle-0.3b" to "OGLE".  Your OGLE plugin files should reside in the following folder:

     

    C:\Program Files\GLIntercept0_5\Plugins\OGLE
  4. Copy "OpenGL32.dll" from "C:\Program Files\GLIntercept0_5" into "C:\Google Earth".
  5. Make a copy of the "opengl32.dll" located in your "C:\Windows\SysWOW64" folder and rename it to "opengl32.orig.dll".
  6. Cut "opengl32.orig.dll" from your "C:\Windows\SysWOW64" folder and Paste it into "C:\Google Earth".  You should now have both "OpenGL32.dll" and "opengl32.orig.dll" in your "C:\Google Earth" folder.
  7. Download this zip and extract the "gliConfig.ini" file into "C:\Google Earth".
  8. Open Google Earth (from "C:\Google Earth") and go to Tools > Options and set the Graphics Mode to OpenGL.  All other settings are up to you.  I left mine at default.  Apply changes and close Google Earth.
  9. If you haven't done so already, install your 3D modeling software now.  Almost all 3D applications come with a trial that you can utilize for testing purposes.
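Steps 4 through 6 are plain file copies, so they can be scripted if you repeat this setup often. This is just a convenience sketch with the three folders passed in as parameters; it copies the system DLL under the new name rather than cut/paste, which has the same end result in the Google Earth folder but leaves SysWOW64 untouched.

```python
import os
import shutil

def stage_glintercept(glintercept_dir, syswow64_dir, google_earth_dir):
    """Place the two DLLs from steps 4-6 into the Google Earth folder."""
    # Step 4: GLIntercept's interception DLL goes next to the GE binary.
    shutil.copy2(os.path.join(glintercept_dir, "OpenGL32.dll"),
                 os.path.join(google_earth_dir, "OpenGL32.dll"))
    # Steps 5-6: the real system DLL, renamed to opengl32.orig.dll,
    # so GLIntercept can forward calls through to it.
    shutil.copy2(os.path.join(syswow64_dir, "opengl32.dll"),
                 os.path.join(google_earth_dir, "opengl32.orig.dll"))
```

After running it, "C:\Google Earth" should contain both OpenGL32.dll and opengl32.orig.dll, exactly as step 6 describes.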

Capture GoogleEarth Geometry:

  1. Open Google Earth (from "C:\Google Earth"), and navigate to whatever zip code/city you wish to capture.  Keep that folder open so you can see any new files created in it from the following steps.
  2. Set up your camera angle/view so all the buildings you want are in view.  However, before it finishes rendering all the buildings, start your capture.
  3. Press CTRL+SHIFT+F to capture.  Google Earth may freeze up for a few seconds, depending on how many buildings are being rendered, so be patient as it extracts the geometry.

     

    Note: you may need to click the Middle Mouse Button inside Google Earth to ensure it is the active window.  Most important, though, is to ensure Google Earth is still rendering buildings before you start the capture; otherwise GLIntercept may not work and you will have to close Google Earth and start over, or try changing your camera angle to force Google Earth to render more geometry.

  4. After the capture has started and Google Earth unfreezes, you will see an "ogle.obj" file in your Google Earth folder, "C:\Google Earth".  Some additional log files and folders will be created as well, but they are not needed at this point.  Depending on how many buildings you captured, you should expect the "ogle.obj" file to be 10-100MB or more.
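Before importing, you can sanity-check the capture by scanning the OBJ for its vertex count and extents. A small sketch that reads the standard Wavefront `v x y z` records (this is generic OBJ handling, nothing OGLE-specific):

```python
def obj_stats(path):
    """Return (vertex_count, (min_xyz, max_xyz)) for a Wavefront OBJ."""
    count = 0
    mins = [float("inf")] * 3
    maxs = [float("-inf")] * 3
    with open(path) as f:
        for line in f:
            if line.startswith("v "):  # vertex position record only
                count += 1
                xyz = [float(t) for t in line.split()[1:4]]
                for i in range(3):
                    mins[i] = min(mins[i], xyz[i])
                    maxs[i] = max(maxs[i], xyz[i])
    return count, (mins, maxs)
```

A successful neighborhood capture should report hundreds of thousands of vertices, and the tiny extents it prints also explain why the imported model looks empty until you scale it up, as described in the Maya steps below.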

Import OBJ Into 3D Application

I will be using Maya to explain the remaining steps, as it is my preferred 3D modeling application.  The steps will be similar for other applications however.

  • Open Maya and go to File > Import.  Navigate to "C:\Google Earth" and select your ogle.obj file and press OK to import.
  • After navigating your camera around a bit, you may be dismayed to only see what looks like the border/menus of Google Earth, and no buildings!

  • Rest assured your building models are there.  They are just EXTREMELY tiny!
  • Zoom into the Origin of your scene and drag select around until you select what appears to be nothing.  You will know you have your buildings selected when your Heads Up Display jumps from 0 to several hundred thousand Verts.  To enable the Heads Up display, go to: Display > Heads Up Display > Poly Count.  You will also see a tiny white blip indicating your selection.
  • Select your Scale Tool, and drag the center yellow box (origin) to the right.  You will see your buildings pop into existence!

  • From here on out, it's a simple matter of scaling, rotating, and modifying your model to your specifications, like so:

 

That should be it.  You now have a workflow to export entire cities from Google Earth into your Modeling application of choice.

I neither support nor condone the use of copyrighted models/assets from Google Earth in personal projects without the express written consent of the original model authors.  My own goal is to utilize an exported Google Earth model purely as reference, to be replaced by my own work.

Final Notes

With regards to mapping Google Earth textures to the buildings in our obj file: you should notice the "CaptureTextureCoords" option in the OGLE settings in the gliConfig.ini file – that should get you the texture coordinates – but linking them with the textures, I believe, has to be done manually. (I did not write OGLE, so I don't know for sure – there is nothing in theory preventing it from working; I just thought it was un-implemented.)

The OGLE plugin for GLIntercept will work on any 32-bit application that uses OpenGL.  And while OpenGL is becoming less and less common these days, if you do a bit of digging around, you can still find games and applications that use it.  For instance, anything written using the idTech3 engine uses OpenGL.  By simply copying the gliConfig.ini, OpenGL32.dll, and opengl32.orig.dll files into my Return to Castle Wolfenstein folder, I was able to capture character models with ease.  Pretty cool, right?

 

WARNING: You will only want to test this on a local game server.  If you join a public game server with these files in your game directory, you may be kicked/banned/etc for trying to run a game exploit of some kind.  This happened to me in RTCW. You have been warned!

Credits

A big thanks to Damian Trebilco, the author of GLIntercept, who personally helped me to get this working in 2013.  Seriously, without his help, this would not have been possible.
And secondly, to Daniel Belcher, whose original article inspired me beyond words and got me started along this path to begin with.  Thanks!
 