Material Manipulation Machine | TiedUp | End-Effector for Interactive Intersection Detection and Robotic Wire Tying
by clara_made in Circuits > Arduino
Our concept automates the repetitive process of wrapping wire around reinforcement bars on construction sites. Instead of using mobile machines, we propose a setup in which users place sticks freely while a robot handles the wire connections.
Aim
We aim to develop an interactive construction method using intersection detection and robotic wire tying. This approach enables automation and customization of rebar-tying techniques and machinery.
This project was conducted by Niki Kentroti, Simon Joller, and Clara Blum as a part of the Computational Design and Digital Fabrication Seminar (CDDF) in the Master's programme Integrative Technologies and Architectural Design Research (ITECH) at the University of Stuttgart during the Summer Semester of 2023.
Supplies
Material to manipulate:
- Timber sticks (approx. 1m long)
- Metallic wire (1.6mm works best in the Extruder)
Robotic Arm:
- 6-axis (in our case, a KUKA 125-2 Robot)
***Disclaimer: since most people probably do not have access to a 6-axis robot arm, we recommend checking whether there is a robotics lab near you that you could use, or building the end-effector and adapting it for use as a hand-held machine rather than an actual end-effector.***
Electronics:
- Arduino UNO and Breadboard
- 2x Nema 17 Stepper Motors
- 2x MG996 Servo motors
- 2x A4988 drivers
- 12V power source
- Camera (we used a Kinect V1)
- 5m USB Cables (to connect to the KUKA Robot)
- Ethernet cable
End-Effector:
- 2x 14mm plywood plates (see plans and dimensions in the next section)
- thinner timber elements
- 4x M6 Rods
- M6 Nuts and washers
- 3d Printer Extruder
- Wire clippers
- optional: black paint
- optional, BUT super helpful: a Dremel, to correct details
Software:
- Rhinoceros 7 and Grasshopper (with plug-ins Firefly, Peacock, Virtual Robot Framework, and Simulacrum (download here))
- KukaVarProxy
- Visual Studio, or any code editor
- Optional (to simulate the robot program): Kuka Office Lite inside a virtual machine (VMware Workstation) and Kuka WorkVisual
Setup
- black fabric or surface as background
- 90x90cm timber frame of 20cm height
Research & Inspiration
We conducted extensive research while developing this project; do have a look at our inspirations.
The Logic
To achieve our goal, we break down the process into tasks: sensing, robotic movement, wrapping, cutting, and twisting.
Process:
The user adds stick elements, and a camera sensor (Kinect V1) captures an image. The image is processed to identify stick intersections. The robot (KUKA) starts from the home position, moves to the updated scan position, moves to the updated intersection point, and the end effector wraps, cuts, and twists the wire.
Phases:
There are three stages in this process: Input, Computing, and Executing. We use KukaVarProxy to communicate with the robot, update positions, and control the target points.
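To make the three phases a bit more concrete, below is a small, purely conceptual C++ sketch of one cycle. In the real setup the Input and Computing phases run in Grasshopper (Firefly, Peacock and a Python script) and the Executing phase is handled by the KUKA and the Arduino, so the three helper functions here are placeholders we made up for illustration, not actual project or KukaVarProxy calls.
// Conceptual sketch only: each helper is an empty stand-in for the real
// Grasshopper / KukaVarProxy / Arduino steps described above.
#include <iostream>

void captureSnapshot()    { std::cout << "Input: snapshot taken\n"; }
void computeTargets()     { std::cout << "Computing: intersections and TCP targets found\n"; }
void sendTargetsToRobot() { std::cout << "Executing: targets sent, tying sequence runs\n"; }

int main() {
    const int sticksPlaced = 3;        // e.g. the user places three sticks, one after another
    for (int i = 0; i < sticksPlaced; ++i) {
        captureSnapshot();             // Phase 1: Input
        computeTargets();              // Phase 2: Computing
        sendTargetsToRobot();          // Phase 3: Executing
    }
    return 0;
}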
Line Detection and Target Position Generation
After placing a stick, a snapshot is taken to confirm its location. The image is processed using Grasshopper with Firefly and Peacock plugins. A Python script detects new lines and adds them to a list. The bisector of the widest angle between intersecting lines is calculated. The TCP (Tool Center Point) is oriented to avoid collisions.
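For readers who want to follow the geometry without Grasshopper, here is a small self-contained C++ sketch of the two computations this step relies on: intersecting two detected stick lines and taking the bisector of the widest angle between them (the direction the TCP is aligned with). The coordinates are made-up example values; the actual project does this inside the Grasshopper Python script.
// Geometry illustration only: intersect two 2D lines and find the direction
// that bisects the widest angle between them. Example coordinates are made up.
#include <cmath>
#include <cstdio>

struct Vec2 { double x, y; };

static Vec2 sub(Vec2 a, Vec2 b)     { return { a.x - b.x, a.y - b.y }; }
static Vec2 add(Vec2 a, Vec2 b)     { return { a.x + b.x, a.y + b.y }; }
static Vec2 scale(Vec2 a, double s) { return { a.x * s, a.y * s }; }
static double dot(Vec2 a, Vec2 b)   { return a.x * b.x + a.y * b.y; }
static double cross(Vec2 a, Vec2 b) { return a.x * b.y - a.y * b.x; }
static Vec2 normalize(Vec2 a)       { double l = std::sqrt(dot(a, a)); return { a.x / l, a.y / l }; }

int main() {
    // Two sticks modelled as lines: a point on the line plus a direction
    Vec2 p1 = { 0.0, 0.0 }, d1 = normalize({ 1.0, 0.2 });
    Vec2 p2 = { 0.2, 0.8 }, d2 = normalize({ 0.3, -1.0 });

    // Line-line intersection: solve p1 + t*d1 = p2 + s*d2
    double denom = cross(d1, d2);
    if (std::fabs(denom) < 1e-9) { std::puts("sticks are parallel"); return 0; }
    double t = cross(sub(p2, p1), d2) / denom;
    Vec2 ix = add(p1, scale(d1, t));

    // Bisector of the widest angle: flip d2 so the two directions enclose
    // the obtuse (>= 90 degree) angle, then average and normalize them.
    Vec2 a = d1, b = d2;
    if (dot(a, b) > 0.0) b = scale(b, -1.0);
    Vec2 bisector = normalize(add(a, b));

    std::printf("intersection: (%.3f, %.3f)\n", ix.x, ix.y);
    std::printf("widest-angle bisector direction: (%.3f, %.3f)\n", bisector.x, bisector.y);
    return 0;
}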
Circuit and Arduino Code
We use an Arduino Uno, two MG996 servo motors, two Nema 17 stepper motors with A4988 drivers, and a 12V power source for our circuit. A hard-coded sequence of calibrated rotations is executed once the end-effector reaches the intersection point to tie it together.
Include all the necessary libraries. We use the BasicStepperDriver library, but feel free to work with AccelStepper or any other library that piques your interest.
#include <Arduino.h>
#include "BasicStepperDriver.h"
#include <Servo.h>
Then define your Motor steps and Revolutions per Minute:
// Motor steps per revolution. Most steppers are 200 steps or 1.8 degrees/step
#define MOTOR_STEPS 200
#define RPM 100
You can also use microstepping; in that case, define it here.
// Since microstepping is set externally, make sure this matches the selected mode
// If it doesn't, the motor will move at a different RPM than chosen
// 1=full step, 2=half step etc.
#define MICROSTEPS 1
Connect all the wires according to the circuit diagram and assign the corresponding pins in your code.
// All the wires needed for full functionality
#define DIR_EXTRUDER 2
#define STEP_EXTRUDER 3
#define DIR_GRIPPERTWIST 4
#define STEP_GRIPPERTWIST 5
#define SERVO_CLIPPER_PIN 6
#define SERVO_TWISTFIX_PIN 7
Here you can define separate steps-per-rotation values for the two stepper motors, then declare the steppers and the servo motors.
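// With MOTOR_STEPS = 200 and MICROSTEPS = 1, one rotation corresponds to 200 step pulses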
#define STEPS_PER_ROTATION (1 * MOTOR_STEPS * MICROSTEPS)
#define STEPS_PER_ROTATION_GRIPPERTWIST (1 * MOTOR_STEPS * MICROSTEPS)
BasicStepperDriver extruder(MOTOR_STEPS, DIR_EXTRUDER, STEP_EXTRUDER);
BasicStepperDriver grippertwist(MOTOR_STEPS, DIR_GRIPPERTWIST, STEP_GRIPPERTWIST);
Servo clipper;
Servo twistfix;
Afterwards follows the setup. We add a delay at the end to let the motors stabilize before starting the sequence.
void setup() {
extruder.begin(RPM, MICROSTEPS);
grippertwist.begin(RPM, MICROSTEPS);
clipper.attach(SERVO_CLIPPER_PIN);
twistfix.attach(SERVO_TWISTFIX_PIN);
delay(1000);
}
And finally, we come to the loop, where you define the sequence of movements your motors should cycle through.
void loop() {
//Extruder - feed the wire (28 full rotations)
for (int i = 0; i < 28; i++) {
extruder.move(-STEPS_PER_ROTATION);
}
delay(500);
//Clipper - swings out to cut the wire (write 100) and back to 0 degrees
clipper.write(100);
delay(500);
clipper.write(0);
delay(1000);
//Grippertwist - Gripper closes
for (int i = 0; i < 5; i++) {
grippertwist.move(-STEPS_PER_ROTATION_GRIPPERTWIST);
}
delay(1000);
//Twistfix - back to 0 degrees - Release
twistfix.write(0);
delay(1000);
//
//Grippertwist - Twisting
for (int i = 0; i < 6; i++) {
grippertwist.move(STEPS_PER_ROTATION_GRIPPERTWIST);
}
delay(1000);
//Twistfix - back to 90 degrees - Close
twistfix.write(90);
delay(1000);
//Grippertwist - Gripper opens
for (int i = 0; i < 5; i++) {
grippertwist.move(STEPS_PER_ROTATION_GRIPPERTWIST);
}
delay(1000);
}
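As written, loop() repeats the whole tying sequence over and over. If you would rather run it only on demand (for example once the end-effector has reached an intersection point), one option is to wait for a trigger character on the serial port, as in the hedged sketch below; the trigger character 'T' and the tieSequence() wrapper are our assumptions for illustration, not the communication scheme the project uses.
// Assumed on-demand variant: run the sequence once per 'T' received over Serial.
void tieSequence() {
  // move the body of loop() from above into this function
}

void setup() {
  Serial.begin(9600);           // the same USB cable that runs to the controlling PC
  // stepper and servo setup as shown above
}

void loop() {
  if (Serial.available() > 0 && Serial.read() == 'T') {
    tieSequence();              // wrap, cut and twist exactly once
    Serial.println("done");     // optional acknowledgement back to the PC
  }
}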
Downloads
End-Effector Build
The end effector consists of four main elements:
- Sensing unit: Kinect V1.
- Wrapping unit: wire extruder and guide elements.
- Cutting unit: clipper and gears.
- Twisting unit: gripper mechanism and two-stage rotation system.
Most of the parts were 3D printed in PLA. Additional metallic parts were integrated to allow for smoother rotations and ensure stability.
In the attached files you will find all the pieces; cut the timber parts and 3D print the printed ones. Assembly follows the same files, but keep in mind that a helping hand is sometimes worth it: we tried to keep everything as compact as possible, so some parts are a bit tricky.
Also, have a look at this video that we based our gripper on; the channel has lots of great mechanisms.
Robolab Setup
In our Robolab setup, we use a black background for better stick and intersection detection. We also position a 90x90cm black timber frame on which the sticks are placed, so the end-effector can reach the intersections without colliding with the ground. The Kinect remains connected to the laptop for image processing.
Robotic Implementation
We still faced some challenges with our implementation. The communication between the Arduino and the KUKA was not ideal and currently works only over a cable. The clipping and twisting also did not work as well as expected, so there is still room for improvement ...
Your Go! Try It Out!
It is a rather complex build, and as mentioned before, access to a 6-axis robot is not a given. Nevertheless, we hope you enjoy this project, find some inspiration, or even build and improve it.
If you have any questions, we are happy to help :)
What Happens Now?
With our project, we still face challenges such as:
- Working with 3D structures
- Adding a microphone or “gesture” sensor for improved Human-Robot Collaboration
Nevertheless, we see potential in:
- Developing the mechanical actuation further
- Fine-tuning the actuator sequence
- Solving the clipping, as it does not fully work yet
- Adding Wi-Fi to the robot-Arduino interaction