Processors of medical tubing are being required to deliver products to tighter-than-ever specifications on inside and outside diameter (ID/OD), wall thickness, ovality, and concentricity…all while they themselves seek higher line speeds and enhanced material properties.
The main difference between medical tubing and other products is that, in the former, processors are working with tolerances of ten-thousandths of an inch instead of the more typical thousandths of an inch. In response, extruder and melt-pump manufacturers are enhancing their machinery to provide more consistent throughputs, and compounders are working to make the materials as repeatable as possible. Yet advances in puller technology are the unsung hero in helping medical-tubing processors improve both product precision and throughput.
The puller is the heart of the tubing extrusion process. While material throughput rate is generated by the extruder, and the tubing is initially formed in the die, it’s the puller that develops the ID/OD and wall thickness. The accuracy of those dimensions—as well as the ovality or “roundness” and overall quality of the tubing—will be determined by the accuracy and repeatability of the puller speed control. Consistency of speed, in turn, is the product of several different variables in the puller drive train, all of which have been the subject of recent intensive research and engineering. It is important to understand how each of these design details contributes to improved puller performance and why buying a cheaper puller may be false economy.
MORE CONTROL WITH DIGITAL SERVOS
Speed control starts with the motor that drives the sheaves and pulleys that move the belts that grip the tubing and pull it through the extrusion line. Most medical processors opt for either closed-loop vector drives or the ultimate: the digital servo drive system.
Closed-loop vector drives have come a long way in control technology, with higher-resolution optical encoders typically generating 1024 pulses per revolution. They use time-based control (not position-based), and this technology has improved dramatically in recent years due to faster processor speeds. However, the drive motors are the real concern. They typically have large-diameter shafts and consequently are higher in inertia. As a result, when a speed correction needs to occur, it takes longer for the drive motors to make this change. The end result is tubing of inconsistent quality.
The rotating shafts of today’s digital servo systems, on the other hand, are typically small in diameter, which relates to low inertia. They also are crammed with highly efficient magnets that make them very responsive. Servo drives typically use both speed control and positional control with rotary encoders that generate 4200 pulses/revolution (16,800 counts with ×4 quadrature decoding). This means that the servo reads and attempts to correct its speed and position 16,800 times per revolution.
The servo speed and position are actually controlled within the revolution of the motor, and the more pulses/revolution the more accurate the control. In addition, because it is a true digital system, a serial operator interface is used in place of an analog potentiometer, so that noise and voltage fluctuations will not cause speed variation. The servo controller digitally controls the speed and position of the servo motor internally. The operator interface is used only to enter setpoints.
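To put the resolution figures above in perspective, the arithmetic can be sketched in a few lines. This is an illustration only, using the encoder line counts quoted in the article; the function names are my own, and real drives differ in how they decode and act on these counts.

```python
# Compare the angular resolution of the two encoder types discussed above.
# Line counts (1024 and 4200) are taken from the article; everything else
# here is illustrative arithmetic, not a drive-vendor specification.

def quadrature_counts(lines_per_rev: int) -> int:
    """x4 quadrature decoding yields 4 counts per encoder line."""
    return lines_per_rev * 4

def degrees_per_count(lines_per_rev: int) -> float:
    """Smallest detectable shaft movement, in degrees."""
    return 360.0 / quadrature_counts(lines_per_rev)

vector_encoder = 1024   # typical closed-loop vector drive encoder
servo_encoder = 4200    # typical digital servo rotary encoder

print(quadrature_counts(servo_encoder))           # 16800 correction points/rev
print(round(degrees_per_count(vector_encoder), 4))  # ~0.0879 deg per count
print(round(degrees_per_count(servo_encoder), 4))   # ~0.0214 deg per count
```

The factor-of-four difference in angular resolution is one reason, alongside lower rotor inertia, that servo systems can correct speed within a single revolution rather than after it.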
TRANSMITTING SERVO MOTOR ACCURACY
With a digital system like this, drive-motor speed variation can be eliminated almost entirely. So the next step in configuring a really accurate and repeatable puller involves transmitting energy from the motor shaft to the puller belts that contact the extruded tubing. Today, this almost always involves a belt-drive system or, even better, a gearbox. With AC inverters, DC drives, and open- or closed-loop vector drives, the load to which the drive is connected does not have much impact on overall system performance, and so this part of the puller is often overlooked. With servo-drive systems, however, the effects can be dramatic and so they demand close attention.
We have learned that servo-motor technology requires torsional stiffness in order to optimize the tuning of the servo motor. Movement of the servo-motor shaft that does not meet resistance—sometimes called dither or backlash—can limit how accurately the manufacturer can tune the system. To get an idea of how much dither/backlash is in a drive system, rock the puller belt back and forth with the power off. Any drive-shaft movement that doesn’t also rotate the drive motor is dither, and it is undesirable in a servo motor because it introduces variability.
A belt-drive system is acceptable for many simpler applications, particularly at speeds under 200 ft/min (61 m/min). Much like an automobile fan belt, a single serpentine belt transmits the drive energy from the motor to sheaves that turn both the upper and lower puller belts. A servo-rated belt should always be used, however, and the belt should wrap around at least 60% of the sheave circumference to minimize slippage.
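The 60% wrap guideline above translates directly into a wrap angle and a contact arc length. The short sketch below does that conversion; the 4-in. sheave diameter is a hypothetical example value, not a figure from the article.

```python
import math

def wrap_angle_deg(wrap_fraction: float) -> float:
    """Wrap angle (degrees) for a given fraction of sheave circumference."""
    return wrap_fraction * 360.0

def contact_length_in(sheave_dia_in: float, wrap_fraction: float) -> float:
    """Arc length of belt-sheave contact, in inches."""
    return wrap_fraction * math.pi * sheave_dia_in

# The article's 60% minimum wrap equals a 216-degree wrap angle.
print(wrap_angle_deg(0.60))                     # 216.0
# On a hypothetical 4-in. sheave, that is about 7.54 in. of belt contact.
print(round(contact_length_in(4.0, 0.60), 2))   # 7.54
```

The point of the exercise: wrap fraction, not sheave size alone, sets how much friction surface is available to resist slippage, which is why the guideline is stated as a percentage of circumference.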
When higher speeds are needed, or when the puller needs to speed up and slow down to produce tapered or “bump” tubing, a puller with a gear reducer will always offer greater precision than a belt-drive system, and a servo-rated, reduced-backlash gearbox is preferred. These devices were first used in pullers to improve precision in feeding and delivery speed and, thus, to enhance cut-to-length tolerances. Now, it’s been discovered that there is a direct relationship between reduced backlash and repeatability. So, not only do cut-to-length tolerances improve, but with more consistent speed control, tubing OD, ID, and wall thickness also become more consistent.
For best performance, a puller should have independent servo motors to drive the upper and lower belt booms. Each servo motor then direct-drives a low-backlash reducer that is coupled to the belt sheave by a torsionally stiff coupling (no keyways). This configuration will result in optimum servo performance and maximum tubing precision and repeatability.
The belts that actually contact the extruded tubing have also evolved to ensure that the precision of the servo drive and gearbox is faithfully transmitted to enhance tolerances. Flat belts work well for applications requiring high speeds and extreme accuracy, and for many years they were the preferred design. Then tracking issues caused them to fall out of favor. The belts would wander on the sheaves unless constrained by side flanges. These, in turn, tended to wear the edge of the belts, creating particulate that could contaminate the tubing.
To overcome the tracking issue, the industry initially moved toward use of poly-V belts, which are still seen quite commonly today. Triangular ridges, which run parallel to the extrusion line on the inside of the belt, engage with grooves in the driven sheave to make them self-tracking. Poly-V belts became very popular for this reason and because they allowed for a variety of covering materials to be used. However, poly-V belts have been unable to deliver sufficient accuracy—regardless of the drive system—for acceptance in applications that demand extremely tight tolerances. That’s because they tend to be thicker and stiffer than other belts, making it difficult to maintain a consistent grip on the extruded tubing.
A bigger problem still is the potential mismatch between the belt teeth and the machined sheave. Even 0.001-in. differential between the tooth peak-to-peak distances can be amplified over 12 to 32 teeth so that the teeth don’t consistently engage. This condition causes belt wear and particulate contamination. More importantly, it leads to variable slippage between the belt and sheave, making it next to impossible to transmit the accuracy of a servo drive to the extruded tubing. The only way to minimize variable slippage of poly-V belts is to tension the belt extremely tightly so that the belt teeth are literally forced into the grooves in the sheave. This makes the reliability and consistency of the belt-tensioning system critical, and manual tensioning is not recommended.
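The amplification described above is easy to quantify. Using the article's own figures (a 0.001-in. pitch differential, 12 to 32 engaged teeth), the worst-case accumulated mismatch can be sketched as follows; the assumption that every tooth is off by the same amount is a simplification for illustration.

```python
def cumulative_mismatch_in(per_tooth_error_in: float, teeth_engaged: int) -> float:
    """Worst-case accumulated pitch error, assuming every tooth
    carries the same differential (an illustrative simplification)."""
    return per_tooth_error_in * teeth_engaged

# Figures from the article: 0.001-in. differential, 12 to 32 engaged teeth.
for teeth in (12, 32):
    print(teeth, cumulative_mismatch_in(0.001, teeth))
# 12 teeth -> 0.012 in.; 32 teeth -> 0.032 in. of accumulated mismatch,
# orders of magnitude above the ten-thousandths tolerances medical
# tubing demands.
```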
The preferred belt technology involves timing belts with herringbone-shaped grooves, which are both self-tracking and non-slipping. These belts are positional, due to the way their teeth engage, so they need no side flanges. Particulate contamination is also dramatically reduced. The teeth on the inside of the belt are surface-ground to reduce thickness and ensure consistent thickness in both axes for optimum pulling. The shape of the teeth allows these belts to be used in combination with flat idler rolls, so they deliver many of the same benefits as flat belts but without the tracking issues.
While certain belts perform better than others, there is one puller feature that can help improve the performance of any belt: multiple, well-placed idler rolls. To minimize variable slippage, the contact between the belt and the tubing must be firm and consistent across the length of the belt. To cut costs, however, some pullers have just one or two idler rolls between the drive sheaves; so unless the belt is highly tensioned, its grip on the tubing can vary along its length. Multiple rolls in a solidly built puller can go a long way toward ensuring that the precision of the drive system is transmitted to the belt and the tubing.
With the combination of low-backlash servo-drive systems; torsionally stiff, low-dither gearboxes; independently driven upper and lower beams; and improved belt technology, pullers for medical tubing are able to maintain the tightest tolerances, even in small-diameter and micro-bore/multi-hollow tubing. For producers of larger “commodity” tubing, these same technologies are being adapted to larger pullers so that tighter tolerances can result in material savings.