
Performance Enhancement in a Two-Camera Visual Servoing System Using Artificial Neural Networks




DOI: https://doi.org/10.15866/ireaco.v16i4.23663

Abstract


Visual sensory information from cameras used for robot guidance and feedback control broadens the scope of industrial automation. Heterogeneous workspaces and parallel operations may require data from more than one sensor. This paper presents a regulatory visual servo control scheme in which two cameras cooperate for robot manipulation, with a switching approach to both configuration and supervision. A supervisory camera placed at a convenient location monitors the workspace, effectively extending the field of view of the end-effector camera during Image-Based Visual Servoing (IBVS). The robot is first steered by a hybrid control law and then switched to IBVS within its stable, converging region. Gain factors chosen to complement each other ensure a smooth changeover between control laws and a smooth velocity profile, following a qualitative approach. Alternatively, trained Artificial Neural Networks (ANNs) improve performance through prediction servoing over regions of uncertainty, multiple switches, and multivariable loop interactions.
Copyright © 2023 Praise Worthy Prize - All rights reserved.
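
To make the switching idea concrete, the sketch below shows the standard IBVS velocity law, v = -lambda * L^+ (s - s*), together with a complementary-gain blend between a hybrid-law command and the IBVS command so the commanded velocity stays continuous across the changeover. This is a minimal illustration only: the helper names (interaction_matrix, ibvs_velocity, blended_velocity), the linear ramp used for the blending weight, and the feature values are assumptions made for demonstration, not the control laws or gain schedule reported in the paper.

import numpy as np

def interaction_matrix(points, depths):
    """Classical 2x6 image-Jacobian rows for normalized point features
    (x, y) at estimated depth Z (Chaumette-Hutchinson form)."""
    rows = []
    for (x, y), Z in zip(points, depths):
        rows.append([-1 / Z, 0.0, x / Z, x * y, -(1 + x * x), y])
        rows.append([0.0, -1 / Z, y / Z, 1 + y * y, -x * y, -x])
    return np.array(rows)

def ibvs_velocity(s, s_star, L, lam=0.5):
    """Standard IBVS law: camera velocity v = -lambda * pinv(L) * (s - s*)."""
    return -lam * np.linalg.pinv(L) @ (s - s_star)

def blended_velocity(v_hybrid, v_ibvs, alpha):
    """Complementary-gain changeover: alpha ramps from 0 to 1 as the
    feature error enters the region where IBVS is stable and converging,
    keeping the commanded velocity continuous across the switch."""
    return (1.0 - alpha) * v_hybrid + alpha * v_ibvs

# Illustrative use with four point features (values are placeholders).
if __name__ == "__main__":
    pts = np.array([[0.1, 0.2], [-0.1, 0.2], [0.1, -0.2], [-0.1, -0.2]])
    target = pts * 0.5
    depths = np.full(4, 1.0)
    L = interaction_matrix(pts, depths)
    v_ibvs = ibvs_velocity(pts.ravel(), target.ravel(), L)
    v_hybrid = np.zeros(6)                          # placeholder hybrid-law output
    err = np.linalg.norm(pts - target)
    alpha = np.clip(1.0 - err / 0.5, 0.0, 1.0)      # illustrative blending schedule
    print(blended_velocity(v_hybrid, v_ibvs, alpha))

In this form, alpha = 0 reproduces the hybrid law and alpha = 1 reproduces pure IBVS, so any monotone schedule driven by the feature error yields a continuous velocity profile at the point of switching.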

Keywords


Artificial Neural Network; Computational Intelligence; Machine Vision; Regulatory Control; Robot Vision; Visual Servoing




