eBUS Studio
Simplify AI and Machine Vision Application Development
eBUS Studio reduces complexity, cost, and time for manufacturers, integrators, and imaging device suppliers designing and deploying AI and machine vision applications for automated quality inspection.
The intuitive “low-code” software development platform, built on Pleora’s industry-leading eBUS SDK, delivers unique capabilities to connect and configure imaging devices; deploy custom, open-source, and third-party machine vision and AI inspection applications on a flexible range of processing options; and seamlessly integrate with critical manufacturing-floor systems and processes.
Connect and Communicate
Our unique open connectivity approach ensures zero vendor lock-in, giving you full flexibility to choose the components best suited to your application while managing fewer tools. Enjoy full compatibility with third-party image sources (including GigE Vision, USB3 Vision, and UVC cameras, as well as photo and video files), out-of-the-box integrations with programmable logic controllers (PLCs), and two-way communication with ERP/MES systems. The built-in eBUS SDK simplifies image acquisition, device control, and configuration while ensuring optimal performance.
Low-Code Development
You don’t need programming expertise to develop inspection applications with eBUS Studio. Start developing your own machine vision, AI, and hybrid MV/AI applications with intuitive low-code, block-based tools. Tailor pre-packaged quick-start templates. Leverage integrated support for open-source or third-party machine vision libraries to easily import, customize, and generate performance-optimized vision and machine learning code that can be redeployed across multiple projects.
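As a hypothetical illustration of the kind of reusable Python vision routine that could be imported into a low-code project, the sketch below uses OpenCV to count dark defects in a frame and return a pass/fail verdict. The function name, parameters, and threshold values are assumptions made for this example and are not part of the eBUS Studio API.

```python
# Hypothetical reusable vision routine; names and thresholds are illustrative only.
import cv2
import numpy as np

def count_dark_defects(image_bgr: np.ndarray,
                       threshold: int = 60,
                       min_area: float = 25.0) -> dict:
    """Count dark blobs larger than min_area pixels and return a pass/fail verdict."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Dark regions become white in the inverted binary mask.
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    defects = [c for c in contours if cv2.contourArea(c) >= min_area]
    return {"defect_count": len(defects), "pass": len(defects) == 0}

if __name__ == "__main__":
    # Quick self-test on a synthetic frame with one dark spot.
    frame = np.full((240, 320, 3), 200, dtype=np.uint8)
    cv2.circle(frame, (160, 120), 10, (0, 0, 0), -1)
    print(count_dark_defects(frame))  # {'defect_count': 1, 'pass': False}
```

A routine like this can live in a shared script and be reused across projects with different threshold and minimum-area settings.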
Train and Test
eBUS Studio significantly simplifies AI training and testing, with full customization for required inspection accuracy. For classification applications, Pleora’s unique approach automatically labels raw or processed machine vision data. This means you can build a customizable data set to help speed AI training, without needing to source numerous good and bad product images. With a user-friendly tool, designers can quickly annotate images for AI-based object detection processes. Applications can then be tested in any web browser before deployment to speed development and time-to-market.
Scalable, Future-Proof Deployment
Applications developed in eBUS Studio can be deployed on a range of computing and processing devices, including x86 and ARM-based edge platforms. Processing flexibility lets you “develop once” and deploy applications across multiple end-user sites, regardless of their processing choice. Integrated MQTT and OPC-UA low-code blocks help ensure future-proof Industry 4.0 compatibility.
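As a rough sketch of the kind of Industry 4.0 message an MQTT low-code block could carry, the example below publishes an inspection verdict with the open-source paho-mqtt package. The broker address, topic layout, and payload fields are assumptions for illustration and do not describe eBUS Studio’s own interface.

```python
# Minimal sketch, assuming a plant broker at broker.example.local and a
# site-specific topic convention; not part of the eBUS Studio API.
import json
import time

from paho.mqtt import publish

def publish_inspection_result(station_id: str, serial: str, passed: bool,
                              defect_count: int) -> None:
    """Publish one inspection verdict as a JSON payload to the plant broker."""
    payload = json.dumps({
        "station": station_id,
        "serial": serial,
        "pass": passed,
        "defect_count": defect_count,
        "timestamp": time.time(),
    })
    publish.single(
        topic=f"factory/line1/{station_id}/inspection",  # assumed topic layout
        payload=payload,
        qos=1,
        hostname="broker.example.local",                 # assumed broker address
        port=1883,
    )

if __name__ == "__main__":
    publish_inspection_result("visual-qc-01", "SN-000123", False, 2)
```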
Key Features
- Standards-Based Connectivity — Built on Pleora’s industry-leading eBUS SDK, eBUS Studio lets you control, configure, and acquire images from any machine vision camera, or access test images and MP4 video files, all from within the browser-based user interface.
- Factory-Floor Integration — Programmable logic controller protocol support (EtherNet/IP, PROFINET, Modbus RTU, Modbus TCP), MES/ERP communication via REST API (see the sketch after this list), and Industry 4.0 connectivity with OPC-UA and MQTT low-code blocks.
- Streamline Development — Browser-based, low-code, visual programming interface for machine vision, AI, or hybrid MV/AI applications. Quick-start templates for common inspection requirements including measurement, reference image comparison, defect and object detection, sorting, counting, and more. Built-in version control for applications and scripts.
- Python Support — Import your own custom Python code, open-source code, and open-source or pretrained models from third-party machine vision software and solution providers.
- Intuitive AI Training — Automatically capture and identify images to train Classification models, or capture and annotate images to train Object Detection models, all within the eBUS Studio browser-based user interface.
- Develop, Test and Reuse — Build and test your machine vision and AI application in real time directly in eBUS Studio’s browser-based application. Reuse code across multiple projects with Python scripts and by creating your own user-defined templates.
- Flexible Deployment — Deploy your machine vision, AI, or hybrid MV/AI application directly from eBUS Studio’s browser-based user interface to a wide variety of runtime options, including NVIDIA Jetson edge devices as well as industrial x86 PCs.
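To illustrate the MES/ERP communication mentioned under Factory-Floor Integration, the sketch below posts an inspection record to a REST endpoint with the Python requests library. The endpoint URL, field names, and bearer-token authentication are assumptions about a generic MES, not a documented eBUS Studio or vendor interface.

```python
# Hypothetical MES/ERP REST integration; URL, fields, and auth are assumptions.
import requests

MES_URL = "https://mes.example.local/api/v1/inspections"   # assumed endpoint
API_TOKEN = "replace-with-your-token"                      # assumed auth scheme

def report_inspection(serial: str, passed: bool, defect_count: int) -> dict:
    """POST one inspection record to the MES and return its JSON response."""
    response = requests.post(
        MES_URL,
        json={"serial": serial, "pass": passed, "defect_count": defect_count},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=5,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(report_inspection("SN-000123", True, 0))
```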