# Software calibration of Gcode based on physical measurements

Hi,

I have been searching for a solution to automatically correct the linear errors of my Shapeoko XXL, and I couldn’t find anyone who had written a simple program that corrects a G-code file based on a machine’s measured behavior.

In particular, I found that my machine had slightly different effective X and Y steps and was very hard to square precisely. These errors are easy to correct mathematically using an inverse bilinear transformation. Since I couldn’t find anything, I decided to write a Python script that does the job. It took me a little less than a week to debug everything, and it works now.

Starting from a situation where a commanded 600 mm was in reality 599.5 mm in X and 601 mm in Y (which is not bad), and the diagonals of a 600 x 600 mm square were 851 mm and 846.5 mm (which is not very good, but I bet really not uncommon among Shapeoko users; the “ideal diagonal” 600·√2 should be very close to 848.5 mm), a single use of the script perfectly corrects:

• X and Y deviations
• unsquareness of the machine

The observed differences between the CAD model and the output virtually disappear in my case, which is awesome.
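For anyone curious about the math, here is a minimal sketch of what an inverse bilinear correction can look like. This is my own illustration, not the thread author’s actual script, and the function names are made up: the four measured corners define a bilinear map from the ideal 600 × 600 square to the real output, and each target point is pushed back through the inverse of that map, here via Newton iteration.

```python
def bilinear(u, v, a, b, c, d):
    """Forward map: unit square (u, v) -> measured quadrilateral a-b-c-d."""
    x = (1-u)*(1-v)*a[0] + u*(1-v)*b[0] + u*v*c[0] + (1-u)*v*d[0]
    y = (1-u)*(1-v)*a[1] + u*(1-v)*b[1] + u*v*c[1] + (1-u)*v*d[1]
    return x, y

def inverse_bilinear(p, a, b, c, d, iters=20):
    """Newton iteration: find (u, v) such that bilinear(u, v) lands on p."""
    u, v = 0.5, 0.5
    for _ in range(iters):
        x, y = bilinear(u, v, a, b, c, d)
        rx, ry = p[0] - x, p[1] - y
        # Jacobian of the forward map at (u, v)
        dxu = (1-v)*(b[0]-a[0]) + v*(c[0]-d[0])
        dyu = (1-v)*(b[1]-a[1]) + v*(c[1]-d[1])
        dxv = (1-u)*(d[0]-a[0]) + u*(c[0]-b[0])
        dyv = (1-u)*(d[1]-a[1]) + u*(c[1]-b[1])
        det = dxu*dyv - dxv*dyu
        u += (rx*dyv - ry*dxv) / det
        v += (ry*dxu - rx*dyu) / det
    return u, v
```

Commanding `(600*u, 600*v)` instead of the ideal `(x, y)` then makes the distorted machine land on the ideal point.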

It would be really nice if the people at Carbide 3D could implement this in a future version of Carbide Motion. How should I go about requesting that?

How does it work?

Basically, you take a small bit (e.g. 2 mm diameter) and drill 4 points on your wasteboard. Note that your wasteboard should have been surfaced first, to avoid Z errors from an uneven surface.

We call these points a, b, c, d:

```
d__________________c
|                  |
|                  |
|                  |
|                  |
|                  |
a__________________b
```

a and d should be aligned with the machine’s Y axis

their coordinates are:

a = 0, 0
b = 600, 0
c = 600, 600
d = 0, 600
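The drilling pattern itself is simple enough to generate by hand; as an illustration (my own helper, not part of the original script, and the parameter values are only examples), a few lines of Python can emit it:

```python
def drill_pattern(size=600.0, depth=-1.0, safe_z=5.0, feed=100):
    """Emit a tiny G-code program drilling the four corner points a, b, c, d."""
    corners = [(0.0, 0.0), (size, 0.0), (size, size), (0.0, size)]  # a, b, c, d
    lines = ["G21", "G90", f"G0 Z{safe_z:.3f}"]
    for x, y in corners:
        lines.append(f"G0 X{x:.3f} Y{y:.3f}")
        lines.append(f"G1 Z{depth:.3f} F{feed}")  # plunge just enough to mark
        lines.append(f"G0 Z{safe_z:.3f}")
    return "\n".join(lines)
```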

Drilling these lets us measure the machine’s real output. My script requires the G-code file to have its origin at X0 Y0; beware that if you set your origin in the middle, you will have negative values and the script will crash (this could be handled if someone helps rewrite that part of the code).

Now we can measure the following distances:

ab (side)
bc (side)
ac (diagonal)
bd (diagonal)

The script will reconstruct the real output and calculate a correction to be applied to the G-code.
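One plausible way to do this reconstruction (a sketch under my own assumptions, not necessarily what the script actually does) is to model the machine as having an X scale, a Y scale, and a skew term, and solve for them from the four measured lengths:

```python
import math

def fit_distortion(ab, bc, ac, bd, size=600.0):
    """Recover X scale, Y scale and skew from the four measured lengths.

    Model: commanded (X, Y) comes out as (sx*X + skew*Y, sy*Y).
    """
    sx = ab / size                                   # bottom side gives X scale
    skew = (ac**2 - bd**2) / (4 * ab * size)         # diagonal difference gives skew
    sy = math.sqrt(bc**2 - (skew * size)**2) / size  # right side gives Y scale
    return sx, sy, skew

def correct(x, y, sx, sy, skew):
    """Pre-distort a target point so the machine's output lands on (x, y)."""
    cy = y / sy
    cx = (x - skew * cy) / sx
    return cx, cy
```

With the numbers from the first post (ab = 599.5, bc = 601, ac = 851, bd = 846.5) this gives sx ≈ 0.99917, sy ≈ 1.00165, and a skew of roughly 3.2 mm over 600 mm.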

Because it is impossible to apply linear corrections to arcs and circles (G2 & G3 commands), the script first “linearizes” the G-code (transforming all G2/G3 into G1 segments), then applies the correction to every X and Y coordinate.
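A linearization step like the one described can be sketched as follows (my own version, assuming a chord tolerance of 0.2 mm as mentioned later in the thread; the real script may differ):

```python
import math

def linearize_arc(start, end, i, j, cw, tol=0.2):
    """Approximate a G2/G3 arc by G1 chord endpoints, max deviation `tol` mm."""
    cx, cy = start[0] + i, start[1] + j          # arc center from I/J offsets
    r = math.hypot(i, j)
    a0 = math.atan2(start[1] - cy, start[0] - cx)
    a1 = math.atan2(end[1] - cy, end[0] - cx)
    sweep = a1 - a0
    if cw and sweep >= 0:                        # G2 goes clockwise
        sweep -= 2 * math.pi
    elif not cw and sweep <= 0:                  # G3 goes counter-clockwise
        sweep += 2 * math.pi
    # largest angular step whose chord stays within `tol` of the true arc
    step = 2 * math.acos(max(0.0, 1 - tol / r))
    n = max(2, math.ceil(abs(sweep) / step))
    return [(cx + r * math.cos(a0 + sweep * t / n),
             cy + r * math.sin(a0 + sweep * t / n)) for t in range(1, n + 1)]
```

Each returned point becomes a G1 move; the tighter the tolerance, the more segments (and the larger the file).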

Unfortunately I couldn’t write this in Python 3 because I am still behind and using Python 2.7 on my system… but it should work on Python 3 with some minor changes (like adding parentheses to print statements and so on).

Please let me know if it works for you!

Also, let’s convince the Carbide Motion team to include this in the software, that would be awesome.


It’s an ingenious idea, much like auto-leveling for a PCB. That said:

• adjusting the steps/mm in Grbl is easier
• if your machine has uneven belt stretch/tension on Y it would be better to address that mechanically by getting them to have even tension
• what precision are you using to convert G2/G3 arcs? does it have performance or file-size or quality concerns?

and some questions:

• how does one manage the measurement over such a long distance? Rule of thumb is one wants to be able to measure ten times more accurately than what one is attempting to calibrate. Tim Foreman on the Shapeoko forums borrowed some expensive metrology equipment when calibrating his SO3 early on — not everyone has access to such

My suggestion would be to put the code up on GitHub and see if the bCNC folks would want to add it — it would be a good fit there since they already have auto-leveling and it’s a Python project.


Hi Will,

• adjusting the steps is easy, yes, but it is not necessary if you use my protocol
• about belt tension: sure. However, adjusting belt tension precisely requires precision tools as well
• in any case a well-adjusted machine is of course better, but squaring the machine is always the most difficult part, and this protocol saves the user the pain of squaring perfectly
• the precision I am using for arcs is 0.2mm, but this is a parameter that can be changed in the program easily
• performance does not seem to change at all
• file size usually gets 10 to 30% larger (it depends on the amount of G2 & G3 to be converted to G1)
• no noticeable difference in quality

and of course a huge improvement in the accuracy of the cuts

• measurement over long distances is made using a steel ruler on the surfaced bed; differences below 0.25 or 0.5 mm are ignored
• actually, you don’t need high-precision instruments. If you measure over 600 mm and make sure your machine lands between 599.5 and 600.5 mm, the residual error on smaller cuts is very small. The trick is: the larger the distance you measure, the smaller the relative error, even with regular art-workshop tools.
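Spelled out (my own arithmetic, assuming a worst-case ruler reading error of ±0.5 mm):

```python
ruler_error = 0.5                # worst-case reading error of a steel ruler, mm
span = 600.0                     # calibration distance, mm

scale_error = ruler_error / span         # relative error of the calibration
residual_60mm = scale_error * 60.0       # what remains on a 60 mm cut

print(f"{scale_error:.4%} scale error, {residual_60mm:.3f} mm left on 60 mm")
# prints: 0.0833% scale error, 0.050 mm left on 60 mm
```

The same ±0.5 mm read over a 60 mm span would instead bound the scale error at 0.83%, ten times worse; hence measuring over the longest distance available.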

I have tried my best to avoid any typo in my program so the Gcode is safe to use. No problems so far.

I have converted files from Fusion 360 that have 2D contour, 3D contour, Adaptive clearing, and Constant Z; so far everything works perfectly.

I’ll have a look at bCNC, but since I am using Carbide Motion, I thought it would be great to have this function in that program.

Attached are two files; the one ending in ABC.nc is the direct output of my script.

If you want later today I can post a picture of the part that I am currently cutting.

example.nc (729.8 KB) example.ncABC.nc (968.6 KB)

Cool experiment!

Do you mind me asking: did you try the correction on pieces of various sizes, located in various places across the wasteboard?

As someone who spent waaaaay too much time on X/Y calibration and mapping (as in, really going too far), I would be interested to know how well the correction values hold across different sizes and locations. My own conclusion in the end was to leave this alone, leave the X/Y steps at their default values, and deal with precision at the project/CAM level (that said, the squaring of my machine is pretty good, so that helps)


Bonjour Julien,

I am currently cutting 80 identical pieces on a sheet of 6 mm MDF; within 3 hours or so I will be able to give you an answer.

Nice approach, I’d be interested to hear what @fenrus thinks too.

I’d suggest that a phone and a ruler are all you need to do this accurately and repeatably.

In general, any scaling error or backlash that can be taken out of the mechanism before applying the soft calibration is a good idea, so I’d suggest setting your belt tensions first, especially balancing the two Y belts, then applying the soft corrections.

If you don’t set and check the belt tensions you won’t know when they start to drift; your soft calibration will drift with the physical system and you won’t get any warning of belt failures.

Indeed, squaring is a problem, but there’s a repeatability problem which you may not have spotted yet. The X beam sits at whatever angle it started up at, which is quite variable; the machine is not able to pull the X beam square, as the two Y motors run in lock step and there’s a single Y homing switch.

To use your adjustments repeatably, I would suggest shimming the X beam so it sits square enough on its own, and then pulling it back to the rear stops before powering up the machine, so you start with the same X beam angle you calibrated from.

I agree with this. I use a long steel ruler, a pointed engraving bit, and a magnifier (phone camera zoomed in), and I also calibrate X and Y steps over 600 mm; otherwise the smaller errors that vary over distance make it nearly impossible to see the macro-scale calibration.

So a user who moves their machine regularly (there are some with a need to do this) could presumably run a fairly rapid re-squaring in software after each move?

Please, please, use a testing framework to write unit and end-to-end tests that actually validate what’s happening. Please don’t write, use, or distribute untested or hand-tested code; that’s the road to pain for everyone.


Most importantly, what kind of testing framework would you suggest?

I routinely copy/paste original vs. modified code into ncviewer in order to make sure there’s nothing wrong with the conversion.


Cool idea… and yeah one needs to break arcs up (I do the same in my tooling).

the self leveling comment is an interesting one… if one had a super accurate Z probe one could measure, rather than level, the wasteboard and compensate for it throughout.

the X/Y stretch is apparently not quite universal throughout the whole bed, so for the super precision folks this should likely also take the workpiece zero into account (see the “how to make a round hole” thread from last week)


One idea would be to use the same approach over a chessboard division of the wasteboard, with more points (9 or 16, for example; this could be automated somehow). Do you have any data regarding XY stretch? Is there a kind of pattern? More stretch towards the edges or the center?

For the sub-division thing — just machine V-carved circles at the grid intersections, then prick them w/ a more acute set of calipers to measure — it’s pretty fast and quite repeatable.

@Julien

It’s hard to tell whether there are such differences with the pattern I have created, because it is made of triangles. It’s not easy to get accurate measurements with a caliper on such shapes in MDF…

When I have time, I will try with a more dimensionally stable material.

What I can do so far is an overall measurement:

length at the bottom of my MDF: 379.75 mm
length at the top: between 379.5 and 379.75 mm
length in the CAD file: 380 mm

height on the left: 496.5 mm
height on the right: 496.5 mm

diagonal 1 on the MDF: 624.5 mm
diagonal 2 on the MDF: 625 mm

These results are insanely better than with any other form of calibration I have tried before. So at least I can say that this works for me.


so for bonus points… your script could suggest a new GRBL settings value
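Something like this would do (a sketch; 40 steps/mm is the stock belt-axis value on a Shapeoko 3, used here only as an example):

```python
def suggest_steps(old_steps, commanded, measured):
    """New steps/mm so that `commanded` mm of travel really comes out as commanded."""
    return old_steps * commanded / measured

# If 600 mm commanded came out as 599.5 mm, steps/mm must grow slightly
new_x = suggest_steps(40.0, 600.0, 599.5)   # suggested $100
new_y = suggest_steps(40.0, 600.0, 601.0)   # suggested $101
```

This covers the scale part; the skew/squareness part still needs the G-code transform, since Grbl has no setting for it.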


@fenrus actually it’s true. Then the only thing left to correct would be the deviation from the right angle. Maybe that is something the hardware could compute in real time?
Or is the hardware of the S3 already powerful enough to compute an inverse bilinear operation?

isn’t the homing function enough to get repeatable results in this regard?

One thing is true, and a bit problematic: if I change the origin of the G-code, the whole job will still be scaled; proportions will be right, but it may come out a tiny bit smaller or larger depending on the XY deviations. A good fix would be to use machine coordinates to do the calibration over the largest range possible (like 720 x 720 mm for an S3 XXL) and then take the work origin and the machine’s origin into account when recomputing the files.

I wish, but Grbl 1.1 already uses 99.9% of the resources of the ATmega328 microcontroller; it’s pretty hard to add anything meaningful to the code at this point.

Nope, unfortunately not, that’s one of the problems that most visitors to this rabbit hole run into on their way to tea with the mad hatter…

Aha, I was going to ask “what language”, but then I went and checked your GitHub and it’s in nice civilised Python, so I can help with that.

I’m going to say Python’s unittest or pytest; both PyCharm and Visual Studio have built-in support for running those test frameworks.

Some simple “load file, process, write, compare with test target” end-to-end tests will make your life a lot easier.
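A skeleton of such a test might look like this (pytest style; `process` here is a toy stand-in for the real linearize-and-correct pipeline, and the scale factor is just an example):

```python
import re

def process(text, sx=1.0):
    """Toy stand-in for the pipeline: scale every X coordinate by `sx`."""
    return re.sub(r"X([-0-9.]+)",
                  lambda m: f"X{float(m.group(1)) * sx:.3f}", text)

def test_known_input_matches_golden(tmp_path):
    # end-to-end: write input, process, compare against frozen expected text
    src = tmp_path / "in.nc"
    src.write_text("G1 X100.000 Y50.000\nG1 X200.000 Y50.000\n")
    out = process(src.read_text(), sx=600.0 / 599.5)
    assert out == "G1 X100.083 Y50.000\nG1 X200.167 Y50.000\n"
```

Freezing one known-good output file per toolpath type (contour, adaptive, etc.) would catch regressions in both the arc conversion and the coordinate correction.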