Missing Intervals

Wendz 1 year ago in Geology updated by Pedro Nader 12 months ago

Does each and every borehole really need to have every interval logged to generate an implicit model?

I have lots of drillholes - how do I fix this quickly without going back to Excel?

Have you tried Drillhole | Data | Insert Missing Intervals?

Yes I have.

I get an error: "Validation errors found in interval or collar file. Use Drillhole Validation to find these errors. Failed to insert missing intervals."

I checked the validation and I do not have any collar errors. I have loads of interval errors though, including hole not defined, overlapping intervals, hole does not start from zero, and the missing intervals.

Must I fix every error before I can use Insert Missing Intervals?


You will need to, as a minimum, clean up the overlapping intervals. You should also make sure the interval file is sorted by hole ID and depth first, just in case that is what is giving you problems.

If you have overlapping intervals, the modeller cannot work out where to put the contacts.
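To illustrate why the sorting matters: once the interval file is ordered by hole ID and FROM depth, an overlap is simply a row whose FROM is shallower than the previous row's TO. A minimal Python sketch of that check (field names HOLE_ID/FROM/TO are illustrative assumptions, not Micromine's internals):

```python
# Illustrative overlap check on a sorted FROM/TO interval file.
# This is a sketch, not Micromine's Drillhole Validation logic.

def find_overlaps(intervals):
    """Return (hole, prev_to, next_from) for each overlapping pair."""
    # Sort by hole ID, then FROM depth - the order recommended above.
    ordered = sorted(intervals, key=lambda r: (r["HOLE_ID"], r["FROM"]))
    overlaps = []
    for prev, curr in zip(ordered, ordered[1:]):
        if prev["HOLE_ID"] == curr["HOLE_ID"] and curr["FROM"] < prev["TO"]:
            overlaps.append((curr["HOLE_ID"], prev["TO"], curr["FROM"]))
    return overlaps

rows = [
    {"HOLE_ID": "DH001", "FROM": 0.0, "TO": 5.0},
    {"HOLE_ID": "DH001", "FROM": 4.5, "TO": 9.0},   # overlaps the row above
    {"HOLE_ID": "DH002", "FROM": 0.0, "TO": 3.0},
]
print(find_overlaps(rows))
# → [('DH001', 5.0, 4.5)]
```

Without the sort, consecutive rows need not belong to the same hole or be in depth order, and the pairwise comparison breaks down, which is why an unsorted file can make the tool misbehave.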

Ultimately it's best practice to address any missing intervals in a geology log: even core loss should be logged as such, and non-sampled intervals inserted. But as a quick fix, inserting missing intervals will handle your "does not start from zero" problems.
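Conceptually, the quick fix amounts to walking each sorted hole down from 0 m and padding every gap, including a hole that doesn't start from zero, with a flagged filler row. A rough Python sketch of that idea (not Micromine's actual Insert Missing Intervals implementation; field names and the INSERTED flag are assumptions):

```python
# Sketch of gap-filling in a FROM/TO file: pad every unlogged gap,
# including a missing 0-to-first-log interval and an unlogged tail.

def insert_missing(intervals, hole_depths):
    """Fill gaps in FROM/TO logging, flagging inserted rows."""
    by_hole = {}
    for r in sorted(intervals, key=lambda r: (r["HOLE_ID"], r["FROM"])):
        by_hole.setdefault(r["HOLE_ID"], []).append(r)

    out = []
    for hole, recs in by_hole.items():
        cursor = 0.0                      # every hole should start from zero
        for r in recs:
            if r["FROM"] > cursor:        # gap before this logged interval
                out.append({"HOLE_ID": hole, "FROM": cursor,
                            "TO": r["FROM"], "INSERTED": 1})
            out.append(r)
            cursor = r["TO"]
        eoh = hole_depths.get(hole, cursor)
        if eoh > cursor:                  # unlogged tail down to end of hole
            out.append({"HOLE_ID": hole, "FROM": cursor,
                        "TO": eoh, "INSERTED": 1})
    return out

rows = [{"HOLE_ID": "DH001", "FROM": 2.0, "TO": 6.0}]
filled = insert_missing(rows, {"DH001": 10.0})
print([(r["FROM"], r["TO"], r.get("INSERTED", 0)) for r in filled])
# → [(0.0, 2.0, 1), (2.0, 6.0, 0), (6.0, 10.0, 1)]
```

Note that this walk only works once overlaps are removed, which is exactly why the overlapping-interval errors must be cleaned up first.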

The IM will ignore drillholes with no log or assay at all. For intervals that you have inserted, you can choose to ignore or exclude them in the IM, but this should be evaluated on a case-by-case basis.

If you need to exclude holes with no log/assay from the IM, I usually run a merge from the collar file to the interval file, filtering to the drillholes which are not defined. I never do data manipulation in Excel, as it's too easy to make a mistake and Micromine has the tools you need.
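That merge boils down to a set difference: collar hole IDs that never appear in the interval file. A toy Python equivalent (hypothetical field names, just to show the logic):

```python
# Toy version of merging the collar file against the interval file
# to find holes with no logging at all.

def undefined_holes(collars, intervals):
    """Return collar records whose hole ID never appears in the interval file."""
    logged = {r["HOLE_ID"] for r in intervals}
    return [c for c in collars if c["HOLE_ID"] not in logged]

collars = [{"HOLE_ID": "DH001"}, {"HOLE_ID": "DH002"}, {"HOLE_ID": "WB01"}]
intervals = [{"HOLE_ID": "DH001", "FROM": 0.0, "TO": 5.0}]
print([c["HOLE_ID"] for c in undefined_holes(collars, intervals)])
# → ['DH002', 'WB01']
```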

Once you've cleaned up the data, I'd advise archiving your old Excel file and exporting your clean DHDB from Micromine; otherwise you will have to fix these problems again and again. Plus, if you get to public reporting, any consultant will raise these errors as a concern. Errors in the data = errors in the model. They will also likely ask why some intervals or drillholes have no log. There could be a good reason, like not logging a pre-collar in a deep deposit, or core loss, but it's good to think ahead.

Richard is spot on - you should make sure the files are all sorted. The collar file needs to be sorted by hole name; the survey file by hole name and survey depth; the geology file by hole name and from depth.


Good Morning

Micromine 2023 now has a checkbox in the Insert Missing Intervals function called Insert unassayed drillholes. It works very well for implicit modelling.

When you open the tool, you select your assay file and your collar file, and mark the checkbox.

All the unassayed drillholes from the collar file will then be populated.



Hi Pedro,

I can see one painful issue with that 2023 form... there is no way to filter the collars. If you have a series of holes in the data that do not have assays (say a set of in-pit drill holes that were not logged for some reason), you would likely want to filter those holes so that they are not included. Sure, you could create a filtered subset of the collar file, but it would be easier to just "filter the water bores", etc.



Hi Ron,

Thanks for the feedback! I understand - it would be, for example, blasting drill holes that were not assayed, right? So the models could pass through them. Is that what you mean?




Exactly, Pedro. If a hole that passes through the modelled area has no data (assay or geology) and you "insert" a background value into that drill hole, it would adversely impact the model. Being able to filter the data to exclude certain drill holes, whilst infilling background "missing data" where relevant, would be useful, whereas filling a background value into every drill hole could result in inaccuracies. For instance, I use a flag field in my collar table to flag out drill holes I cannot validate, or that have data issues (missing/corrupted/untrustworthy data). I can filter this data using a "Flag Not Equal to 1" DH/table filter, but I could not use that filter in this instance, as I can filter neither the collar table nor the interval table. I can (and do) create a subset of the drill hole database based on this flag and use that, but that can get messy with multiple copies of various filtered subsets of the database; it would be easier just to be able to filter the database/collar/interval files to start with.

Anyway - I think we have stolen the thread... sorry! As per the other comments, Wendz - you need to ensure you have clean and ordered data first. Lots of tools are available in MM to do that, but sometimes several hours of pain and tedium are the price you pay for usable data!



Ronald, I agree with your comments. All functions which use the drillhole DB should allow a collar filter and a filter on any referenced downhole files. This ability is an enormous help in reducing the number of data sets you have in a project and the associated possibility (probability) of things getting out of sequence.

Going back to the original query. As a general comment, "Excel is not a database", and while we have all done it, it is not the place to store data - neither is Micromine, for that matter. Both are data manipulation tools. Source data should be kept in a well-constructed and managed database; yes, it's expensive and time-consuming, but the value of the data it contains makes it worthwhile.


Thank you very much, Ron and Keith, for sharing your insights; they are very valuable. I totally agree with you! I will log this in our system as an improvement for the missing intervals tools.

Thanks Ron and Keith - there is no way I can fix thousands of poorly logged/transcribed GC holes, when I actually only need to filter off the DD holes from the collar file and work with those.

I will have to create a subset DH DB in this case - not ideal.

Hi everyone!

Ron, Keith, we have added a filter for collars to the function. Wendz, you will now be able to filter only the valid holes.

Do you know if it would be beneficial to have filters for intervals as well? Would that improve the tool a lot? Do you know of any situation where it would be necessary?


Fast turnaround! Thanks Pedro. Personally I do not see a purpose in filtering the interval file, given the whole point is to fill the whole drill string - filtering the interval table just gives more of the drill string to fill. With the collar filter you are effectively filtering out the unwanted holes, leaving you just the holes you want to fill out; any intervals you do not want would be effectively filtered at that point. I suppose you might want to filter negative values, etc., but I create a new field for modelling where all that is taken care of anyway. But I'm not very inventive - maybe others can think of a reason?



Totally agree with your points, Ron. I guess you could use the filter to remove 0-length intervals, etc., but you would hope the data would mostly be reasonably clean at this stage.



Hi, I actually think there is a case for filters on the downhole data. It is not unusual to see datasets where both composite intervals and detailed intervals are recorded in the same file, or re-logged values, etc., or just data that has been flagged for a purpose. Providing a filter on downhole files allows easy selection of data subsets for further work.

Thank you everyone for your replies! They are very helpful!

I don't see any need to filter the interval file in this case, as surely it would result in overlapping intervals? Filtering by the collar file is very useful though.

Pedro, a couple of things I think are missing from Insert Missing Intervals are:

  1. Flag inserted records, with a field and value.
  2. Similarly, generate default values for certain fields; for example, I might want to assign a default value of 0.001 (or another number) to unassayed intervals.
  3. This is the big one: I'd really like to insert missing intervals split into regular intervals rather than one single interval. The reason is I often see unsampled core in areas one would reasonably expect to contain mineralization. Perhaps it was weakly mineralized and the geologist, being a human XRF machine, decided to save some $ and not assay it, but we really want an assay value to smooth the grade transition in the model more accurately. If I can split the inserted missing intervals into 1 m lengths, accept 0.5 m and add the residual to the last interval, then I can wireframe assign and calculate distances to WF more easily and send the geo a sampling list. At the moment I have to DH composite the assay file with inserted intervals and use a constant field like SampleID to get the same result.
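Point 3 is easy to express in code: split a filled interval into regular steps, merging a residual shorter than the minimum length into the last piece rather than leaving a sliver. A hypothetical Python sketch of that rule (1 m step, 0.5 m minimum as suggested; this is an illustration, not a Micromine feature):

```python
# Sketch of splitting an inserted interval into regular pieces, with a
# residual shorter than min_len merged into the last piece.

def split_interval(start, end, step=1.0, min_len=0.5):
    """Split [start, end] into step-length pieces with residual handling."""
    length = end - start
    n_full = int(length // step)
    residual = length - n_full * step
    pieces = [(start + i * step, start + (i + 1) * step) for i in range(n_full)]
    if residual > 1e-9:                    # tolerance for float noise
        if residual >= min_len or not pieces:
            pieces.append((end - residual, end))   # long enough to stand alone
        else:
            pieces[-1] = (pieces[-1][0], end)      # merge sliver into last piece
    return pieces

print(split_interval(10.0, 13.3))
# → [(10.0, 11.0), (11.0, 12.0), (12.0, 13.3)]
```

Here the 0.3 m residual is shorter than the 0.5 m minimum, so it is absorbed into the last 1 m piece; a residual of 0.5 m or more would become its own interval.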

Hi Richard.

Thank you very much for your insights. I see your point. Today I do exactly the same as you mentioned, but I use different tools for that: Insert Missing Intervals, Generate Fields and Compositing. I think Leapfrog has a tool that does all of that, but it's inside their grade modelling tool - is that correct?

Hi Pedro,

Yes, LF has a form in the DH database where you can set how you want non-numeric, zero and negative values, missing intervals and missing values to be handled. They used to allow you to create a lookup table in the old version of LF where you could set all this up once, direct the DH database to the table and handle it automatically; now you have to do this manually for every numeric variable, every time, which in my mind is going backwards - progress is fewer mouse clicks, not more!

I do concur with Richard: being able to insert a background value or flag, and to select an interval length, could be an advantage at this step.


That sounds useful; negative values in assays are another one that gets me wound up. When I have a messy data set like this, I usually keep the original assay column - let's call it Cu%. I'll then generate Cu%_plot, which has all the good values, with overlimits merged in, -ve values changed to half the lower DL, etc. I like a third column called Cu%_ins, where I insert the default values for missing intervals; sometimes I have two of these, with different inserted values.
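That derived-column approach boils down to a small lookup rule per value. A hedged Python sketch of the Cu%_plot idea, where the sentinel codes and the detection limit are assumptions for illustration, not values from this thread:

```python
# Sketch of deriving a Cu%_plot value from a raw assay value:
# sentinels become missing, below-detection results become half the DL.
# SENTINELS and DETECTION_LIMIT are assumed example values.

SENTINELS = {-999.0, -9999.0}   # assumed "no data" codes in the raw file
DETECTION_LIMIT = 0.01          # assumed lab lower detection limit for Cu%

def cu_plot(raw):
    """Derive a plotting/estimation value from a raw assay value."""
    if raw is None or raw in SENTINELS:
        return None                      # truly missing - leave blank
    if raw < 0:
        return DETECTION_LIMIT / 2       # below detection -> half the DL
    return raw

print([cu_plot(v) for v in [1.25, -0.01, -999.0, 0.0]])
# → [1.25, 0.005, None, 0.0]
```

A Cu%_ins column would then be the same rule with one extra branch that substitutes a chosen default into the flagged inserted intervals.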

All the tools are there to do this in MM, but it requires a bit of hopping between forms, or writing a logical expression with a few if/elif statements, which can stretch the mind power on a busy day for an amateur coder.

Thank you very much, everyone, for your valuable feedback. I totally understand your point, and I use multiple tools to do that. We are so close to the software that sometimes we don't realize that something we do daily might have workarounds that aren't so friendly for clients.

Richard, Ron, I am bringing the subject up internally to see how, and if, it would fit in our software for future versions; some of that feedback I am posting in this message. Richard, suggestions 1 and 3 are really awesome, and I think you have a very good point with suggestion 2 - some of the points mentioned are in your last comment. A more general tool for data preparation would be very nice: it may be worth creating a data prep tool that not only inserts missing intervals but also deals with a collection of the common issues that need to be fixed to generate data ready to be composited for estimation. We would also need to look after -999s, detection-limit assays, nulls, etc. Splitting missing intervals into regular sizes would maybe be a good idea as well. I don't know when and how it will come to our software in future releases, because the discussion is very new, but it is awesome that we are talking about it and recognizing the needs. If you have any suggestions, please let us know.

Regarding the Insert Missing Intervals tool, we have good news. The developers have added the option to generate a separate output, which is an add-on with filters for collars. The conversation here on the forum was key to improving our tool, and I really appreciate all your feedback! Here is a screenshot of how the tool looks now: