Have a question?  Maybe you'll find the answer here in our FAQs.  If you still have questions, just contact us and we'll be happy to help!  You may also want to check out one of our blogs.

 

Does CMMshop.ca ship worldwide?

Yes!  We have most countries listed in our shipping options.  If you don't see yours, don't worry; just send us a quick email and we'll help you out.  Customers outside the US and Canada must provide their own shipping account number.  Click here for more details.

I'm not sure what thread size I need.  How can I tell?

Simply measure the outside diameter of the thread: approx. 2mm indicates an M2 thread, approx. 3mm an M3 thread, and so on.  Alternatively, you can reference this Thread size guide.
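
The "diameter ≈ M-number" rule can be sketched in a few lines of code.  This is purely our own illustration (the size list and function name are not from any standard or library); it just snaps a measured outside diameter to the nearest common metric thread size:

```python
# Illustrative sketch: snap a measured outside diameter (in mm) to the
# nearest common metric thread size.  The list below is a small,
# hypothetical subset of standard metric sizes.
COMMON_METRIC_SIZES = [1.6, 2, 2.5, 3, 4, 5, 6, 8]

def guess_thread_size(outside_diameter_mm):
    """Return the closest metric designation, e.g. 'M3' for ~3 mm."""
    closest = min(COMMON_METRIC_SIZES,
                  key=lambda size: abs(size - outside_diameter_mm))
    return f"M{closest:g}"

print(guess_thread_size(1.95))  # prints "M2"
print(guess_thread_size(3.05))  # prints "M3"
```

In practice the measured diameter will read slightly under the nominal size because of the thread form, which is why "approx." does the job here.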

Can I exchange something I've purchased from CMMshop.ca?

Absolutely!  We'll exchange anything, anytime, for any reason!  Our goal is to ensure you get what you need, every time.  Just let us know and we'll handle it.

How do I calibrate a disk probe?

To accomplish this manually, take 4 points at the equator of the master ball, then 4 points just above and 4 points just below the equator (12 points total).  The software will use these 12 points to calculate a sphere.  If you're calibrating an 18mm disc, the calibration result will be approx. Ø18mm.  As far as the software is concerned, you've calibrated a sphere, even though you've only touched a thin section of one.  Check your max. diff., std. dev., or error value (depending on your software).  If it's too high, repeat the calibration.  There are different ways to refine this calibration result in CNC/DCC mode, depending on your software.
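
The sphere calculation behind those 12 points can be sketched with a generic algebraic least-squares fit.  This is our own minimal illustration, not the algorithm from any particular CMM package: every point on a sphere satisfies a relation that is linear in a set of transformed unknowns, so four normal equations recover the center and radius.

```python
import math

def fit_sphere(points):
    """Least-squares sphere fit.  Each (x, y, z) on the sphere satisfies
    x^2 + y^2 + z^2 = 2*cx*x + 2*cy*y + 2*cz*z + (r^2 - |c|^2),
    which is linear in the unknowns u = (2cx, 2cy, 2cz, r^2 - |c|^2)."""
    AtA = [[0.0] * 4 for _ in range(4)]   # normal equations A^T A u = A^T b
    Atb = [0.0] * 4
    for x, y, z in points:
        row = (x, y, z, 1.0)
        b = x * x + y * y + z * z
        for i in range(4):
            Atb[i] += row[i] * b
            for j in range(4):
                AtA[i][j] += row[i] * row[j]
    u = _solve4(AtA, Atb)
    cx, cy, cz = u[0] / 2, u[1] / 2, u[2] / 2
    r = math.sqrt(u[3] + cx * cx + cy * cy + cz * cz)
    return (cx, cy, cz), r

def _solve4(A, b):
    """Gaussian elimination with partial pivoting for a 4x4 system."""
    M = [A[i][:] + [b[i]] for i in range(4)]
    for col in range(4):
        piv = max(range(col, 4), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 4):
            f = M[r][col] / M[col][col]
            for c in range(col, 5):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 4
    for i in range(3, -1, -1):
        x[i] = (M[i][4] - sum(M[i][j] * x[j] for j in range(i + 1, 4))) / M[i][i]
    return x

# 12 simulated touch points on a Ø18mm sphere: 4 at the equator (z = 0)
# and 4 each on rings slightly above and below it.
R = 9.0
points = [(math.sqrt(R * R - z * z) * math.cos(a),
           math.sqrt(R * R - z * z) * math.sin(a), z)
          for z in (0.0, 0.5, -0.5)
          for a in (0.0, math.pi / 2, math.pi, 3 * math.pi / 2)]
center, radius = fit_sphere(points)
print(f"diameter = {2 * radius:.3f} mm")
```

Note how the three rings sit close together near the equator: that's exactly the "thin section of a sphere" situation described above, and it's also why noisy points in that band can inflate the reported error.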

How long a probe/stylus can I use?

The rule of thumb is to use the largest ball and shortest shaft you can get away with for maximum accuracy.  Each type of sensor has specific length and weight limits, so be sure to check the manual for your specific sensor to be certain you're not using a probe/stylus that's too long or too heavy.

What's the best tip/ball material for my application?

This is an important question to ask, especially when using a scanning sensor.  The most common material is ruby (synthetic ruby = aluminum oxide), and it's suitable for most applications.  However, if you're scanning aluminum, the ruby will attract the aluminum through a process called "adhesive wear", resulting in a buildup of aluminum on the ruby sphere.  For aluminum scanning applications, we recommend silicon nitride (Si3N4, or more simply SiN), which does not attract aluminum and therefore resists that buildup.  If you're scanning hard steel or cast iron, we recommend zirconia, which exhibits high wear resistance in demanding scanning applications.  Both silicon nitride and zirconia cost slightly more, but both deliver longer tool life, so the price difference becomes a cost savings in the long run compared to using the wrong ball material.  Feel free to ask us if you're not certain which material suits your application.

What grade ball am I getting?

Grade numbers run from 3 (highest grade) to 3000 (lowest grade).  The grade number identifies, in millionths of an inch, the permissible deviation from a perfect sphere.  A grade 3 ball is no more than 3 millionths of an inch (0.000003") from a perfect sphere.  Grade 5 balls are within 5 millionths, grade 10 within 10 millionths, and so on.  We sell only grade 5 or 10 balls on our catalog styli.  Our calibration spheres are all grade 5.  For stylus ball diameters over 15 mm, we offer grade 25.  And for unusual sizes, grade 25 is often the only grade available.
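
Since the grade number is defined in millionths of an inch, converting it to metric units is simple arithmetic.  Here's a quick sketch (the function name is our own):

```python
def grade_tolerance(grade):
    """Permissible deviation from a perfect sphere for a given ball grade.
    The grade number is the deviation in millionths of an inch."""
    inches = grade * 1e-6       # e.g. grade 5 -> 0.000005"
    microns = inches * 25400.0  # 1 inch = 25,400 µm
    return inches, microns

for g in (3, 5, 10, 25):
    tol_in, tol_um = grade_tolerance(g)
    print(f'Grade {g}: {tol_in:.6f}" = {tol_um:.4f} µm')
```

So even a grade 25 ball is within 0.635 µm of a perfect sphere, which is why grade 25 is perfectly adequate for large-diameter and unusual-size styli.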

 

How often should I calibrate my probes/styli?

The answer depends on several factors.

  • How much the CMM is used
  • Environment (cleanliness, temperature stability)
  • Whether any collisions have occurred
  • Material being inspected
  • Tolerance to be inspected

In any case, you need to figure out what works best for you.  Some calibrate for every program, some calibrate daily/weekly, and some seem to ignore this altogether (not recommended).  We've seen customers who have gone close to a year without calibrating!  The question you should ask yourself is, "How confident am I that nothing has compromised the last calibration results?"  There's such a thing as overkill, but if you've got the time to calibrate, why not do it and eliminate that as a potential error in your measurement results?

How accurate is my CMM really?

This is a great question!  There are many factors to consider here.  When the CMM manufacturer states the accuracy of a specific CMM, it's a "best case" scenario: all or most variables have been controlled, and the machine has been tested in a pristine environment by specialists who know the equipment inside and out.  You didn't really expect to get the same accuracy/repeatability, did you?  Well, actually, you did, because that's what they sold you on.  Think of it in terms of an experiment: you need to identify and control all factors that may muddy the waters.

  • Cleanliness of the probe/styli, calibration sphere and workpiece.
  • Environment (temperature stability, dust, vibration)
  • Calibration of the CMM (how recent & what errors were found)
  • Difference in expansion coefficient between CMM scales & workpiece
  • Type of measurement being tested (single point or calculated geometry)
  • Type of sensor
  • Size, grade & length of stylus used
  • Uncertainty of the measurement

You get the picture.  Uncertainty evaluation is a great exercise, even if you're not an inspection lab that must do this to maintain an ISO 17025 accreditation.  I co-founded an inspection/calibration laboratory that maintained an ISO 17025 accreditation, and I can tell you from experience that the actual vs. stated accuracy for a CMM is vastly different in the real world.  For instance, we had a CMM with a stated accuracy of 3 microns (approx. 0.0001"), but in real-world conditions, based on a rigorous uncertainty study, it delivered a best measurement uncertainty of (200 + 0.9L) µin.  This means the best case scenario, under controlled conditions, using single point measurements, was a 0.0002" accuracy for the first inch of measurement, plus 0.9 µin per inch thereafter.  It's best to run your own tests under real-life conditions to get a handle on the actual accuracy of your equipment, whether it's a CMM, caliper, micrometer, height gauge, indicator, etc.
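
To put a budget like (200 + 0.9L) µin in perspective, here's a short sketch.  The coefficients come from the example above; the function name is our own:

```python
def best_measurement_uncertainty_uin(length_in, base=200.0, slope=0.9):
    """Best measurement uncertainty U = base + slope * L, in microinches,
    using the (200 + 0.9L) µin coefficients from the example above."""
    return base + slope * length_in

for L in (1, 10, 40):
    u = best_measurement_uncertainty_uin(L)
    print(f'L = {L:>2}": U = {u:.1f} µin = {u / 1e6:.7f}"')
```

Notice that even across a 40" measurement, the length-dependent term adds only 36 µin; the 200 µin base dominates, which is why environment, cleanliness, and calibration practice matter so much more than the spec sheet suggests.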

My calibration results are bad.  What should I do?

First, check the basics: make sure the stylus ball and calibration sphere are clean, the stylus and sphere are mounted tightly, and nothing has been bumped since the last good calibration.  Then repeat the calibration.  If you're still having issues, contact a CMM tech to investigate.