
Shared System Memory - NVIDIA GeForce GTX 670M

Jurgen Fitschen 9 years ago in General
Hi,

I have noticed that Micromine doesn't use my video RAM but instead places everything in shared system memory. Is there a way of getting around this? I have checked under "Display adapters" for the Intel card but there doesn't seem to be one. I have also checked through the BIOS and there is no way of selecting only the NVIDIA card, and there are no newer BIOS updates available. I have set all settings in the NVIDIA Control Panel to maximum performance.

After all these changes I still see that when I load my models they are placed in system memory (around 12 GB is used) while the 2.7 GB of free video RAM remains untouched.

Any suggestions or ideas would be greatly appreciated.
Hi Jurgen,

I ran into this issue a while back (actually while beta testing some photo software at the same time as the beta Micromine). I have a Toshiba laptop with an NVIDIA GeForce card as well as the standard Intel card (which has less dedicated memory). I went round and round in circles, in the control panel and in online manuals. I gather the idea of the two adapters is that they are supposed to "know" when to activate (e.g. the NVIDIA driver would normally be used for 3D-style video games). However, it clearly doesn't activate automatically with Micromine (or the photo software I was testing).

Finally I found a simple way to use the NVIDIA card for any program (it works on the Toshiba at least). You just need to switch to the NVIDIA display adapter when you run a program like Micromine. The simplest way to do this is to right-click the Micromine icon on your desktop, then select the graphics processor from the pop-up list.



My computer is a few years old so I also needed to update the NVIDIA driver to handle OpenGL 2, which was easy from the NVIDIA website. I haven't been brave enough to set the NVIDIA card as the default, but things work well this way and it does make a noticeable difference.
Hi Norm,

Thanks for the reply. Sadly I have also tried this. From what I can see it has to do with the newer NVIDIA cards using TurboCache and Optimus. There is no easy way of turning this functionality off, and from what I can see people have had to edit their system registries in the past to bypass the memory sharing. If anyone knows of an easy way to do this please let me know!
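
For what it's worth, the only programmatic hook I have come across is aimed at application developers rather than users: according to NVIDIA's Optimus rendering guidelines, an application can export a special symbol from its executable to ask the driver to prefer the discrete GPU. Roughly like this (my reading of the guidelines, not something we can apply to Micromine from the outside):

    #include <windows.h>

    // Exporting this symbol from the .exe tells the NVIDIA Optimus driver
    // to run the application on the discrete (high-performance) GPU.
    extern "C" {
        __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    }

Since that has to be compiled into the application itself, it doesn't help on the user side, hence my question here.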

Thanks again!
Hi Jurgen,


If you access the NVIDIA Control Panel from the desktop you should be able to add those programs that require access to the NVIDIA GeForce video card. I have a GeForce GT 650M card in my high-level Samsung laptop and I can access the card to run Micromine with no problems.



Cheers

Peter
Hi Jurgen,


PS: You can then go to 'Tools/Check Graphic Configuration' in Micromine and obtain confirmation that the NVIDIA card is being used.

Cheers
Peter


Hi Jurgen,

It is important to ensure the most powerful graphics card is enabled for Micromine and the advice given already is excellent.

However, please be aware that there is a difference between application RAM and graphics VRAM. The various objects loaded into Micromine need to be loaded into application RAM. This data is then processed and passed on to OpenGL for rendering. In most cases it is the OpenGL graphics driver that decides what data to load into graphics VRAM for optimal rendering speed. Block models, wireframes and images will also have some footprint in VRAM.
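
To illustrate, this is roughly what happens when geometry is handed to OpenGL (a generic sketch, not Micromine's actual code, assuming an initialized OpenGL 1.5+ context with GLEW for extension loading):

    #include <GL/glew.h>

    // Generic sketch of a vertex upload.
    // 'vertices' is model data already sitting in application RAM.
    void uploadMesh(const float* vertices, GLsizeiptr sizeInBytes)
    {
        GLuint vbo = 0;
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);

        // GL_STATIC_DRAW is only a usage *hint*: the driver, not the
        // application, decides whether this buffer lives in VRAM, in
        // system memory, or is migrated between the two as needed.
        glBufferData(GL_ARRAY_BUFFER, sizeInBytes, vertices, GL_STATIC_DRAW);
    }

Note there is no call by which the application can demand "put this in VRAM"; placement is entirely the driver's decision.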

The usage of VRAM by Micromine will change on a frame-by-frame basis (60 or more frames per second can be rendered), and the VRAM also needs to be shared with other applications, so its usage will be balanced by Windows and the graphics driver.

So, as you saw, the bulk of your data is loaded into application RAM and sent to the graphics card on an as-needed basis. The data in application memory is also required to ensure editing and data-query actions can be processed in a reasonable manner.
Hi all,

Thanks for the feedback and all the advice.

I just can't seem to get my laptop to perform the way I want it to with Micromine. Some of my models just feel "heavy" and I can't spin or move them with ease in the Vizex display.

My laptop specs are:
Processor:
3rd generation Intel Core i7-3630QM Processor with Intel Turbo Boost Technology 2.0
Clock speed: 2.40 GHz (3.40 GHz Turbo)

System Memory:
32 GB DDR3 1600 MHz

Graphics:
NVIDIA GeForce GTX 670M with CUDA Technology; 3,072 MB dedicated VRAM.

Hard disk:
1.75 TB (1 TB solid-state hybrid drive with 8 GB NAND flash + 750 GB)

Surely these specs are more than good enough to run any size model? Is it just because I am using Micromine on a laptop that it feels sluggish? When this model is loaded on a workstation with similar specs it runs fine...


Jurgen,

Please share your graphics settings as detected and used by Micromine. A screenshot of 'Tools/Check Graphic Configuration' will tell us most of what we need (the information is also present in MMlog140.txt).

Can you also share some information about your models? If wireframes, how many triangles and points? If block models, how many blocks and sub-blocks, and what draw style is used?

Your laptop appears quite well specced, so it should handle reasonably sized models well.
Hi Jurgen,

The issue is your graphics card: while it is high end for a laptop of that age, it really isn't that good, especially if you have a higher-resolution screen. You can see in the benchmarks (http://www.notebookcheck.net/NVIDIA-GeForce-GTX-670M.72197.0.html) that its closest desktop equivalent is the AMD Radeon R7 250, which sells for $90. A modern high-end desktop graphics card can cost $1000 or more.

In terms of the 3 GB of VRAM you have, the only real advantage of that much memory is that you can use higher-resolution textures; I don't think this will help with Micromine.
Hi Scott,

Screenshot of Tools/Check Graphic Configuration:



We don't really work with block models, so it's just wireframes. Here is some triangle/point information for some of the wireframes:


Some of the wireframes need many points/triangles as they are implicitly modelled solids which have been snapped to the drillhole database as per the client's request. This is another reason why we are hoping there will be a wireframe simplify function with snapping at some point in the near future.

As mentioned, this model loads and displays smoothly on a colleague's workstation in the office.

Workstation Specs:
CPU:
Intel Xeon E5 @ 3.7GHz

System Memory:
16 GB DDR3 1600 MHz

Graphics Card:
NVIDIA Quadro K4000, 3,072 MB dedicated VRAM.

Thanks for taking the time to answer my question.
Hi Dale,

Thanks for the reply.  That was my first thought too.

If it is my graphics card then I would hope that Micromine would expand on their recommended specifications, as my laptop ticks all the boxes, and even my current graphics card falls into the class 1 category.

I would love to know what laptop specs one would require to view the kind of models we build with ease.
Unfortunately the issue is that laptop builders need to make a compromise between battery life, heat and performance. Even the current generation of laptops with their 980M chipsets run at about 80% of the performance of their desktop cousins. This is why I have a laptop for travelling and a desktop for working with large datasets.
As Dale mentions, a laptop will always have lower graphics capability than a desktop because of the power limits dictated by heat and battery requirements. In this regard, the M (mobile) series of graphics cards from NVIDIA is always of reduced power compared to its desktop cousins.

Our recommended specs for Micromine give the general advice to buy a computer that is suitable for the latest 3D games at full resolution with a high frame rate. Any computer store technician will know what this means and the computer would run Micromine well.

If you need to simplify your wireframes I suggest you use Wireframe | Utilities | Simplify. If a feature is made available in IM to do this, it would very likely be calling this function in the background anyway.
Hi Scott,

I completely agree with both you and Dale.

I am just pointing out that the recommended spec, as stated on the website, indicates that one requires a "High-end NVIDIA or ATI, 1 GB or more of graphics RAM". I know further along in the document it is stated that one should contact Micromine support for further assistance.

What mobile graphics cards would you recommend for the models we are constructing?

The problem with the Wireframe | Utilities | Simplify function is that it removes vertices/points from the wireframe which have been snapped to the drillholes. So if one were to build the model using the snap function in the IM and then simplify, one would lose all the snapped contacts. This has all been mentioned in a previous post (http://forum.micromine.com/topic/493216-point-snapping/) and from the reply it seems to have been added to your backlog.

So it's clear the solution to my graphics problem is to get a workstation.

Thanks for all the replies and info!
Just to add: if your laptop has a higher resolution you could turn it down to get better speed. Some laptops have a 2560x1440 resolution; if you lower this to 1920x1080 you will get better performance, though the image may not look as sharp.
Jurgen,

The Wireframe | Utilities | Simplify function will only remove a triangle vertex if the distance between the vertex and the surrounding plane of triangles is less than the planar tolerance. I would recommend a planar tolerance that is quite small (0.0001) to ensure that areas of coplanar triangles are simplified to use fewer triangles. You can use Wireframe | Clean for the same effect.
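
To make the planar tolerance concrete, the test is essentially the following (a simplified sketch of the idea, not our actual implementation; the plane fitting around each vertex is assumed to have been done already):

    #include <cmath>

    struct Vec3 { double x, y, z; };

    // Sketch of the planar-tolerance test for one vertex.
    // 'onPlane' is a point on the plane fitted through the triangles
    // around the vertex, and 'normal' is that plane's unit normal.
    bool vertexIsRedundant(const Vec3& v, const Vec3& onPlane,
                           const Vec3& normal, double planarTolerance)
    {
        // Perpendicular distance from the vertex to the fitted plane.
        double d = (v.x - onPlane.x) * normal.x
                 + (v.y - onPlane.y) * normal.y
                 + (v.z - onPlane.z) * normal.z;

        return std::fabs(d) < planarTolerance;
    }

With a tiny tolerance such as 0.0001, only vertices sitting essentially on the plane of their neighbours are removed, so vertices that have been snapped away from that plane (for example to drillhole contacts) are kept.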
Thanks Scott!  I will most certainly give that a try.