FAQ, Acronym List, & Troubleshooting

FAQ

  • How are SigSimRT and SenSimRT distributed? Are they “black-box” DLLs with C++ headers, or is there room for custom code modifications? : SigSimRT and SenSimRT are distributed as a DLL-based SDK with highly configurable parameterization. The component-based interfaces allow you to intercept the SDK contributions before applying them to the graphics shaders, textures, and uniforms.
  • Does SigSimRT prepare the data suitable for the pixel shaders, or does it also perform the rendering? : SigSimRT generates uniforms and textures that are consumed by the pixel shaders. The pixel shaders then perform the rendering, and must be modified by the user to call the JRM signature-generation shader functions provided as part of the SDK (a minimal hand-off sketch appears after this FAQ list).
  • Does it come with a list of sensor settings for commercially available devices, or typical settings for military devices? : It comes with notional settings for Visible Camera, EO, NVG, MWIR and LWIR cameras. Unfortunately, specific sensor settings would take this from EAR to ITAR.
  • Can you describe the whole process from constructing a scenario (e.g., modeling objects), to sensor modeling, to outputting simulated images? : Typically, raw satellite and/or feature imagery is imported into JRM’s GenesisMC for material classification at the texture level. The scene geometry, textures, and material files are loaded at startup, and the SigSim libraries are called to provide material spectral data, compute ephemeris and spectral irradiances, the resulting temperature-solution parameters, and atmospheric parameters. GPU shaders then assemble the final at-aperture signature and pass this 2D radiance image to SenSim, which adds physics-based, user-defined sensor effects.
  • How 'close' are simulated IR (MWIR, LWIR, etc.) images to real IR images? Some of the example images on the JRM website contain much less noise than often appears in real IR images. : The example images are rendered with relatively low noise so that we don’t obscure the presentation of other important effects, such as angle-dependent thermal loading, BRDF reflection calculation, the quality and resolution of our terrain and entity texturing, special effects, and so forth. In general, the amounts of sensor noise and blur are governed by physics calculations based on user input of optics, detector, and electronics parameters, such as aperture size and shape, focal length, detector NEDT, 1/f noise exponent and knee frequency, etc. In 2005, JRM’s products were evaluated against non-real-time government codes such as MUSES, SPIRITS, and MODTRAN and found to be accurate to within 2% radiometrically.
  • Can users add materials and material systems to the material library? : Yes. Currently 305 measured but encrypted material files (.MTLE) are shipped, each containing both bulk properties (for thermal calculations) and spectral surface BRDF properties. However, the software will read unencrypted material files (.MTL) as well, and this file format is open, so users may define their own materials. These material files are placed in the same directory as the rest of the materials, and are referred to by name in the definition of material systems, which are layered 1D stacks of raw materials with user-defined thermal boundary conditions at the topmost/outermost and bottommost/innermost layers. This unique “material system” concept allows for proper modeling of thermal inertia in the presence of irradiance and convective loading. The material system files (each containing the set of material systems applicable to a given texture or set of textures) can be placed anywhere on disk, and are loaded when the applicable terrain or 3D feature/entity model is loaded. JRM products ship with default sets of both MTL and MS files.
  • Is automatic gain control supported? : Yes. We support four AGC models, including Histogram Equalization and Histogram Projection, with user-specified minimum and maximum gain limits (an illustrative AGC sketch appears after this list).
  • Does JRM simulate Radar images? :  Yes, JRM has a product (“OSV-Radar”) for predicting radar imagery of various formats (SAR, ISAR, MTI, RBGM, etc.), using the same materially-encoded terrain and entity models as for other bands. The material files contain areal RCS parameters for a variety of RF bands. User inputs include area of interest, average power, gain, PRF, pulse width, RF center frequency, and of course the environmental/atmospheric controls.
  • How about the spatial resolution of the Radar? : The spatial resolution is governed by the user-defined input parameters for such quantities as airspeed, pulse width, PRF, and wavelength, with a lower limit at the spatial resolution of the underlying materially-encoded terrain texture (see the first-order resolution relations after this list).
  • Can GenesisMC be used to do material classification for radar as well as EO/IR? : Yes, the radar module uses the same materially-classified scene database as the EO/IR bands. The accuracy of the GenesisMC material classification is governed by: (a) the number and variety of measured material spectra against which the input image is matched. Here JRM has a clear advantage over other material classifiers, as we are able to measure our own materials and currently have a stock of over 300 different materials to choose from. Other classifiers only classify to the level of broad material category, such as rock, water, grass, and soil; we offer detail down to the level of individual rock, soil, and vegetation species. (b) The spectral resolution (number of channels) of the input imagery. Again, JRM has an advantage in that our classifier can take advantage of an unlimited number of spectral bands. (c) The availability of shapefile data, an industry-standard parameterized format that allows pre-regioning of an area of interest to specifically denote the location of road networks and building footprints. JRM is currently implementing support for shapefile import. (A generic spectral-matching sketch appears after this list.)
  • Does SigSimRT consider the effect of thermal and atmospheric conditions? : Absolutely. In addition to providing a full Modtran/Radtran interface, JRM’s products contain ephemeris models for predicting apparent solar/lunar/stellar positions in the sky, an irradiance model for predicting the spectral direct and diffuse irradiance on surfaces from these sources through the defined atmosphere, and a fully-transient finite-difference thermal solver for predicting temperature based on this irradiance. In fact, JRM’s unique parameterization of the thermal solution feeds GPU shaders with coefficients used in an on-the-fly computation of angle-dependent solar loading based on the specific surface normals encountered during rendering (a first-order loading sketch appears after this list).
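
The shader hand-off described in the second answer above follows the usual GPU pattern: the signature library computes values on the CPU, and the rendering code exposes them to the pixel shaders as uniforms and bound textures. The sketch below is a minimal illustration using raw OpenGL; the names uBandRadianceScale, uSigTexture, and bindSignatureInputs are hypothetical placeholders, not actual SDK identifiers.

    // Minimal sketch of handing signature-library outputs to a user pixel shader.
    // Assumes a current OpenGL context with GL 2.0+ entry points loaded (e.g. via GLEW).
    // All names below are hypothetical placeholders, not SigSimRT identifiers.
    #include <GL/glew.h>

    void bindSignatureInputs(GLuint program, GLuint sigTexId, float bandRadianceScale)
    {
        glUseProgram(program);

        // Scalar uniform computed on the CPU side (e.g. a per-band scale factor).
        glUniform1f(glGetUniformLocation(program, "uBandRadianceScale"), bandRadianceScale);

        // Texture generated by the signature library (e.g. a lookup table),
        // bound to texture unit 0 and exposed to the pixel shader as a sampler.
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, sigTexId);
        glUniform1i(glGetUniformLocation(program, "uSigTexture"), 0);

        // The user's pixel shader then calls the JRM-provided signature shader
        // functions, sampling uSigTexture and applying uBandRadianceScale.
    }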
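
For the AGC answer above, the following is a minimal sketch of histogram-equalization AGC with gain limits, assuming a non-empty single-channel floating-point image; it is illustrative only and not the SDK’s implementation. The per-bin gain is clamped between minGain and maxGain relative to a purely linear stretch.

    // Illustrative histogram-equalization AGC with gain limits (not the SDK code).
    // Maps a non-empty single-channel float image to [0,1].
    #include <algorithm>
    #include <vector>

    std::vector<float> histogramEqualizeAGC(const std::vector<float>& img,
                                            int numBins, double minGain, double maxGain)
    {
        const float lo = *std::min_element(img.begin(), img.end());
        const float hi = *std::max_element(img.begin(), img.end());
        const float range = std::max(hi - lo, 1e-12f);

        // Histogram of the input image.
        std::vector<double> hist(numBins, 0.0);
        for (float v : img) {
            int b = std::min(numBins - 1, static_cast<int>((v - lo) / range * numBins));
            hist[b] += 1.0;
        }

        // Clamp each bin so the local gain stays within [minGain, maxGain] of a
        // linear mapping, then accumulate into a normalized CDF.
        const double linearCount = static_cast<double>(img.size()) / numBins;
        double total = 0.0;
        for (double& h : hist) {
            h = std::min(std::max(h, minGain * linearCount), maxGain * linearCount);
            total += h;
        }
        std::vector<double> cdf(numBins, 0.0);
        double run = 0.0;
        for (int i = 0; i < numBins; ++i) { run += hist[i]; cdf[i] = run / total; }

        // Apply the mapping.
        std::vector<float> out(img.size());
        for (size_t i = 0; i < img.size(); ++i) {
            int b = std::min(numBins - 1, static_cast<int>((img[i] - lo) / range * numBins));
            out[i] = static_cast<float>(cdf[b]);
        }
        return out;
    }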
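
For the radar-resolution question above, the standard first-order relations below show how pulse width, wavelength, and antenna size map to spatial resolution. These are textbook approximations (unmodulated pulse, strip-map geometry), not the exact model used by OSV-Radar; the input values are arbitrary examples.

    // Textbook first-order radar resolution relations (illustration only).
    #include <cstdio>

    int main()
    {
        const double c      = 2.998e8;  // speed of light [m/s]
        const double tau    = 1.0e-7;   // pulse width [s]          (example value)
        const double lambda = 0.03;     // wavelength [m], ~10 GHz  (example value)
        const double D      = 2.0;      // antenna length [m]       (example value)
        const double R      = 10000.0;  // slant range [m]          (example value)

        double slantRangeRes = c * tau / 2.0;   // unmodulated pulse
        double realBeamAzRes = R * lambda / D;  // unfocused, real-beam azimuth
        double sarAzRes      = D / 2.0;         // focused strip-map SAR azimuth limit

        std::printf("slant-range: %.1f m, real-beam azimuth: %.1f m, SAR azimuth: %.1f m\n",
                    slantRangeRes, realBeamAzRes, sarAzRes);
        return 0;
    }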
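
The material-classification answer above matches image pixels against a library of measured material spectra. The sketch below uses the common spectral-angle metric as a generic stand-in; it is not GenesisMC’s actual classifier.

    // Illustrative spectral matcher: assigns a pixel spectrum to the library
    // material with the smallest spectral angle (generic stand-in only).
    #include <algorithm>
    #include <cmath>
    #include <vector>

    // Angle between two spectra treated as vectors (radians).
    double spectralAngle(const std::vector<double>& a, const std::vector<double>& b)
    {
        double dot = 0.0, na = 0.0, nb = 0.0;
        for (size_t i = 0; i < a.size(); ++i) {
            dot += a[i] * b[i];
            na  += a[i] * a[i];
            nb  += b[i] * b[i];
        }
        return std::acos(std::min(1.0, dot / std::sqrt(na * nb)));
    }

    // Returns the index of the best-matching library material.
    int classifyPixel(const std::vector<double>& pixelSpectrum,
                      const std::vector<std::vector<double>>& librarySpectra)
    {
        int best = 0;
        double bestAngle = spectralAngle(pixelSpectrum, librarySpectra[0]);
        for (size_t m = 1; m < librarySpectra.size(); ++m) {
            double ang = spectralAngle(pixelSpectrum, librarySpectra[m]);
            if (ang < bestAngle) { bestAngle = ang; best = static_cast<int>(m); }
        }
        return best;
    }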
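
Finally, the angle-dependent solar loading mentioned in the last answer reduces, to first order, to projecting the direct irradiance onto each surface normal. The snippet below is a minimal Lambertian sketch; the actual shader coefficients are produced by the SDK’s irradiance and thermal models.

    // First-order absorbed irradiance on a surface: direct component projected
    // onto the surface normal plus a diffuse-sky term (illustration only).
    #include <algorithm>

    struct Vec3 { double x, y, z; };

    double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    // n: unit surface normal; s: unit vector toward the sun;
    // eDirect/eDiffuse: band-integrated irradiances [W/m^2]; alpha: solar absorptivity.
    double absorbedIrradiance(const Vec3& n, const Vec3& s,
                              double eDirect, double eDiffuse, double alpha)
    {
        double cosIncidence = std::max(0.0, dot(n, s)); // back-facing surfaces get no direct load
        return alpha * (eDirect * cosIncidence + eDiffuse);
    }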

 
 
Acronym List

  • 6DOF Six Degrees Of Freedom
  • AFRL Air Force Research Laboratory
  • AoA Angle of Arrival
  • API Application Programming Interface
  • ARL Army Research Laboratory
  • BIS Bureau of Industry and Security – US Dept. of Commerce
  • BRDF Bidirectional Reflectance Distribution Function
  • CAD Computer Aided Design
  • CCL Commerce Control List
  • CDRL Contract Data Requirements List
  • CPU Central Processing Unit
  • CSG Constructive Solid Geometry
  • D/A Digital to Analog
  • DBF Digital Beam Forming
  • DF Direction-Finding
  • DRFM Digital RadioFrequency Memory
  • DMD Digital Micromirror Device
  • EAR Export Administration Regulations
  • ECCN Export Control Classification Number
  • EM Electro-Magnetic
  • EO Electro-Optic
  • FLIR Forward-Looking Infrared
  • FFT Fast Fourier Transform
  • FM Frequency Modulation
  • FOV Field Of View
  • GFE Government-Furnished Equipment
  • GFI Government-Furnished Information
  • GHz GigaHertz
  • GIS Geographic Information System
  • GPU Graphics Processing Unit
  • GUI Graphical User Interface
  • GOTS Government Off-The-Shelf
  • ID Identifier
  • IF Intermediate Frequency
  • IFF Identification of Friend or Foe
  • IFFT Inverse Fast Fourier Transform
  • IR InfraRed
  • IRSP InfraRed Scene Projector
  • IRMA Infrared Millimeter Wave Analysis
  • ISO International Organization for Standardization
  • IT Information Technology
  • ITAR International Traffic in Arms Regulations
  • I/O Input/Output
  • I/Q In-Phase and Quadrature-Phase
  • JRM JRM Technologies, Inc.
  • kW KiloWatt
  • LT2 Live Training Transformation
  • LOS Line-Of-Sight
  • LWIR Long-Wave Infrared
  • MCITS Multi-spectrum Combat Identification Target Silhouette
  • MHz MegaHertz
  • MLA Manufacturing License Agreement
  • MCM Material-Classified Map
  • MTL Material
  • MS Material System 
  • ms MilliSecond
  • MTI Moving Target Indicator
  • MTS Moving Target Simulator
  • μm Micrometer (micron) (unit of length)
  • μs MicroSecond
  • MWIR Mid-Wave Infrared
  • NIR Near-Infra-Red
  • NVESD Night Vision and Electronic Sensors Directorate
  • NVG Night-Vision Goggle
  • OOI Object Of Interest
  • OPSEC Operations Security
  • OSV OpenScenegraphViewer (JRM scene simulation product)
  • OTF On The Fly
  • PC Personal Computer
  • POR Program of Record
  • PRF Pulse Repetition Frequency
  • PW Pulse Width
  • RCS Radar Cross Section
  • RF RadioFrequency
  • SAR Synthetic Aperture Radar
  • SBIR Small Business Innovation Research
  • SC Scattering Center
  • SOA Service-Oriented Architecture
  • SOW Statement Of Work
  • SPIE The International Society for Optical Engineering
  • STE Synthetic Training Environment
  • STP Standard Temperature and Pressure
  • SUT System Under Test
  • TAA Technical Assistance Agreement
  • TAP Transition Assistance Program
  • TPOC Technical Point-Of-Contact
  • UD Ulaby-Dobson
  • USCS Universal Scatter Center Set
  • USML United States Munitions List
  • VPN Virtual Private Network
  • W Watt (unit of power)
  • WAS Wide Area Scan

 
Troubleshooting Graphics via the NVidia Control Panel

  • Ensure that you have a late model NVidia card, that your drivers for this card are up-to-date, and that the NVidia card is actually being used (as opposed to motherboard graphics) for the software you are trying to run.   See the "Preferred Graphics Processor" setting on the "Manage 3D Settings" page of the NVidia Control Panel.
  • (Windows: right-click on the desktop and select the NVidia Control Panel. Linux: switch to root, then run “nvidia-settings”.)
  • Virtual Reality Pre-rendered Frames should be set to 1.
  • Anti-aliasing should be set to “Application-controlled” or “Off”.
  • Anisotropic filtering should be set to “Application-controlled”. 
  • Multi-display/mixed-GPU acceleration should be set to “Single display performance mode”. 
  • Threaded optimization should be set to “Off”. If enabled, the driver will consume 100% of one of the processing cores, as well as cause the frame-to-frame timing to wander significantly. 
  • Triple buffering should generally be set to “Off”.
  • Vertical sync should be set to “Force on” for real-time simulation and other applications that require a fixed frame-rate. 
  • Enable Resolutions not Exposed by the Display should be set to “On” under the “Resolution” tab.

Troubleshooting Performance

  • Generation of physics cache files can take some time, but it is unavoidable: the caches exist to speed up processing when previously assigned dates, times, and environmental conditions are revisited.
  • jrmData/atm/cache contains diurnal full-band irradiance cache files, atmospheric parameters, and time-of-day (TOD)-specific atmospheric/irradiance caches.  Of these, the full-band irradiance cache files (*.dc) take the longest time to generate.  The TOD-specific caches take the least amount of time.
  • jrmData/cache contains diurnal thermal cache files corresponding to specific material system sets. These are fairly quick to generate, unless very thin metallic material systems are used.
  • Any new atmospheric conditions (ATM Model, Haze Model, Wind Speed, Air Temperature Limits, Humidity), or a date change outside of 1 week in either direction, will trigger a re-spin of all cache files.  Time-of-day changes will not require new thermal cache re-spins, but will generate new TOD-specific atmospheric/irradiance tables (unless this TOD has already been "visited"). 
  • After altering material system (MS) files, manual clearing of the jrmData/cache directory is required, as the presence of a cache file for a given MS file indicates that thermal spinup on that file is not needed, so edits will not otherwise be processed.

 
Troubleshooting Signatures

  • Turn off sensor effects to see if the issue is pre-aperture or post-aperture in nature.
  • Change from daytime to night-time to see if the issue is solar / reflective in nature.
  • Clear your cache directories (jrmData/cache and jrmData/atm/cache).  Then re-run.
  • Review log files to check for error messages regarding missing geometry, textures, MS files, or MTL files, or bad formatting of such files.  If necessary, you can set "SIGSIM_DEBUG=1" in your environment variables to obtain more output, and you can send this output to disk by starting the program from a command line, e.g. "GuiViewer ../config/jrmSensorSettings.xml >run.log 2>&1".
  • Choose a surface in your scene which you can use temporarily as a blackbody test.  Alter its MS file to associate it with a Blackbody material layer of known, fixed temperature (BC1=0, T1=user-specified temperature). Clear your jrmData/cache directory and re-run.  Then snap a short-range pre-aperture FPF of this surface to disk and check its radiance against a corresponding Planck integral over the waveband of interest, keeping in mind that the Planck integral must be divided by pi to convert to radiance [W/cm2/sr] (a generic integration sketch follows this list). JRM has a suite of blackbody test cubes which can be introduced into your scene for this purpose.
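
A quick way to do the Planck check described in the last step above is to numerically integrate Planck’s law over the waveband and divide by pi. The snippet below is generic physics code (units chosen to give W/cm2/sr), not part of the JRM tool set; the 8-12 um band and 300 K value are example inputs.

    // Hedged sketch of the blackbody sanity check: integrate Planck's law over a
    // waveband, divide by pi, and convert to W/cm^2/sr (not part of the JRM tools).
    #include <cmath>
    #include <cstdio>

    const double PI = 3.14159265358979;

    // Spectral exitance [W / m^2 / m] at wavelength lambda [m] and temperature T [K].
    double planckExitance(double lambda, double T)
    {
        const double h = 6.626e-34, c = 2.998e8, k = 1.381e-23;
        return (2.0 * PI * h * c * c) / std::pow(lambda, 5)
             / (std::exp(h * c / (lambda * k * T)) - 1.0);
    }

    // In-band radiance [W / cm^2 / sr] between lam1 and lam2 [m], trapezoidal integration.
    double inBandRadiance(double lam1, double lam2, double T, int steps = 2000)
    {
        double dl = (lam2 - lam1) / steps, sum = 0.0;
        for (int i = 0; i < steps; ++i) {
            double a = lam1 + i * dl;
            sum += 0.5 * (planckExitance(a, T) + planckExitance(a + dl, T)) * dl;
        }
        return sum / PI / 1.0e4;   // exitance -> radiance, then W/m^2 -> W/cm^2
    }

    int main()
    {
        // Example: 8-12 um band at 300 K; compare against the snapped pre-aperture value.
        std::printf("L = %.4e W/cm^2/sr\n", inBandRadiance(8.0e-6, 12.0e-6, 300.0));
        return 0;
    }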

Be sure to calibrate your expectations: 

  • Most target-detection trials are done in MWIR, which is best for showing contrast between hot thermally-active regions and surrounding background. LWIR, on the other hand, is best for picking out contrast among various features of the natural background.
  • Infrared cameras are primarily used at night or in foggy/obscured conditions, not during the day, when the much higher spatial resolution of visible-band cameras is advantageous. 
  • Don't try using simulated NVG goggles during the day!
  • Overall image brightness depends on the dynamic range of the sensor.  For reflectance band sensors (e.g. VIS, NVG, or anything under 3um), the dynamic range is set by the max light level.  For thermal band sensors (MWIR, LWIR) it is set by the min/max apparent temperatures.
  • Visible and near-infrared band overall scene brightness is particularly sensitive to time of day and date.  The elevation of the moon in the sky, as well as its phase, can require very different max light level settings in order to yield a well-balanced image.   
  • Bare metal surfaces (no paint) will likely be dark in the infrared regardless of their temperature, because they have high reflectivity (low emissivity); see the sketch at the end of this list.
  • Surfaces of high specularity (e.g. water surfaces at low wind speed) can also be dark, as the emissivity (just like the reflectivity) may be highly angle-dependent in this case.  
  • The appearance of local volumetric effects such as dust or smoke clouds depends not only on the PIP file settings, but also on the density of particles, which is often not under our control, but rather under the control of the parent rendering engine.  Some of these obscurant types can also be highly reflective, not just absorptive.
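
Regarding the bare-metal point above: to first order, apparent in-band radiance is a mix of emitted and reflected terms, so a low-emissivity surface mostly mirrors its (often cold-sky) surroundings. A minimal sketch, assuming gray, opaque surfaces:

    // Why unpainted metal looks dark in the infrared (illustration only):
    // with emissivity of roughly 0.05-0.2, the reflected background term dominates,
    // and a reflected cold sky pulls the apparent radiance well below that of a
    // painted (high-emissivity) surface at the same physical temperature.
    double apparentRadiance(double emissivity, double surfaceRadiance, double backgroundRadiance)
    {
        return emissivity * surfaceRadiance + (1.0 - emissivity) * backgroundRadiance;
    }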