The project
ALA — Advanced Laboratory Automation — is a joint research-and-development project between the INPECO Group (world leader in clinical-laboratory automation) and the BioRobotics Institute at the Scuola Superiore Sant’Anna in Pisa, with a total investment of EUR 3.5 million and a team of 30 researchers.
The project tackled three main research lines:
- Software — Deep-learning-based skin-lesion detection: superhuman-performance benchmarks, efficient DL architectures obtained via neural-architecture-search and optimisation techniques (AdaNet, AmoebaNet, NASNet), and classification robustness through training regimens, data augmentation and data generation
- Hardware — Integrated multimodal sensing and DL inferencing device: general system layout, sub-module design, infrastructural integration
- Full traceability of biological samples and clinical data across the entire laboratory-automation chain
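One ingredient of the robustness line above is data augmentation. As a minimal sketch (not the project's actual pipeline), simple geometric augmentations of an image can be expressed over a nested-list pixel grid:

```python
# Minimal sketch of geometric data augmentation for image classifiers.
# Hypothetical stand-in for the project's augmentation step: images are
# represented as nested lists of pixel values rather than tensors.

def horizontal_flip(image):
    """Mirror each row left-to-right."""
    return [row[::-1] for row in image]

def rotate_90(image):
    """Rotate the image 90 degrees clockwise."""
    return [list(row) for row in zip(*image[::-1])]

def augment(image):
    """Return the original image plus simple geometric variants."""
    return [image, horizontal_flip(image), rotate_90(image)]
```

In a real training loop the same idea is applied on tensors (e.g. via a framework's image ops), multiplying the effective size of the labelled dataset.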
The approach was incremental: a state-of-the-art (SoA) model was implemented first, then extended with proprietary techniques, following an iterative development paradigm from prototypes to stable code bases.
The working group
- INPECO Group — client, world leader in pre-analytical and post-analytical laboratory automation
- BioRobotics Institute — Scuola Superiore Sant’Anna — Computer-Integrated Technologies for Robotic Surgery Laboratory, Surgical Robotics and Allied Technologies Area
- noze — AI and cloud software architecture, data pipeline, containerisation, web app and mobile app
- Dermatology Centres of Siena and Livorno — clinical validation, dermatological datasets
noze’s role
Stefano Noferi (noze) took part in the project from 2019 to 2023 as AI and cloud software architect — initially through the BioRobotics Institute at the Scuola Superiore Sant’Anna and subsequently in direct collaboration with the INPECO Group — responsible for the design and implementation of the entire software infrastructure.
1. Cloud and on-premise architecture for the AI pipeline
Design and implementation of cloud (Azure) and on-premise systems for the full AI pipeline: data ingestion, pre-processing, training, inference and model distribution. The architecture — designed with a modular, microservices approach — enabled the collection of pseudonymised data from the various dermatology centres and the centralisation of processing, segmentation and classification, with a clean separation between Frontend GUI, Backend (ANN Model Class) and Patient-data API.
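The pseudonymisation step assumed at ingestion can be sketched as a keyed hash that replaces the patient identifier before records leave a centre. This is an illustrative sketch, not the project's implementation; the key and field names are hypothetical:

```python
# Sketch of pseudonymised data collection: a keyed hash (HMAC-SHA256)
# replaces the direct patient identifier at ingestion time.
# SECRET_KEY and the field names are hypothetical.
import hashlib
import hmac

SECRET_KEY = b"per-deployment-secret"  # hypothetical key held by the backend

def pseudonymise(patient_id: str) -> str:
    """Derive a stable pseudonymous ID; irreversible without the key."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

def ingest(record: dict) -> dict:
    """Strip the direct identifier and attach the pseudonym."""
    out = {k: v for k, v in record.items() if k != "patient_id"}
    out["pseudo_id"] = pseudonymise(record["patient_id"])
    return out
```

The keyed construction keeps the mapping stable (the same patient always yields the same pseudonym, so longitudinal data can be linked) while preventing re-identification by anyone without the key.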
2. Data management and retraining
Implementation of data-management flows for three distinct data types:
- Anamnestic data — patient clinical information collected via Azure backend, processed by the neural engine for anamnestic-risk assessment
- RGB images — frames acquired via the smartphone camera API, processed by the neural engine for lesion segmentation and classification
- Point data and multimodal signals — from spectrophotometers and other sources, stored in the RAW data archive
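The three flows above can be sketched as a simple dispatch from data type to handler; the handler bodies below are placeholders, not the project's code:

```python
# Illustrative routing of the three data types to their processing paths.
# Handler bodies and the string labels are hypothetical placeholders.

def handle_anamnestic(payload):   # -> anamnestic-risk assessment
    return ("risk-engine", payload)

def handle_rgb_image(payload):    # -> lesion segmentation + classification
    return ("image-engine", payload)

def handle_multimodal(payload):   # -> RAW data archive
    return ("raw-archive", payload)

HANDLERS = {
    "anamnestic": handle_anamnestic,
    "rgb_image": handle_rgb_image,
    "multimodal": handle_multimodal,
}

def route(data_type: str, payload):
    """Send an incoming record to the handler for its data type."""
    try:
        return HANDLERS[data_type](payload)
    except KeyError:
        raise ValueError(f"unknown data type: {data_type}")
```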
A continuous retraining pipeline updated the models with new, clinically validated data from dermatologists.
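The retraining trigger can be sketched as a buffer of validated samples that queues a training job once a batch threshold is reached; the threshold and the job queue below are hypothetical:

```python
# Sketch of a continuous-retraining trigger: dermatologist-validated
# samples accumulate until a batch threshold is reached, then a snapshot
# is queued for the trainer. Batch size and queue are hypothetical.

class RetrainBuffer:
    def __init__(self, batch_size: int = 100):
        self.batch_size = batch_size
        self.samples = []   # validated samples awaiting the next job
        self.jobs = []      # queued retraining batches

    def add_validated(self, sample):
        """Store a validated sample; queue a retraining job when full."""
        self.samples.append(sample)
        if len(self.samples) >= self.batch_size:
            self.jobs.append(list(self.samples))  # snapshot for the trainer
            self.samples.clear()
```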
3. Containerised infrastructure
Deployment was carried out via Docker containers with Redis orchestration (Redis Director + Redis Cache) for caching inference requests and routing them to the neural models, alongside an SoA trunk architecture for patient-data persistence and retrieval.
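The caching role played by Redis Cache can be illustrated with a cache-aside pattern: inference results are keyed by a hash of the input, so repeated requests skip the model call. A plain dict stands in for Redis here; in production the same get/set calls would go through a Redis client:

```python
# Cache-aside sketch of the inference cache. A dict stands in for Redis;
# the model call is a placeholder, not the project's neural engine.
import hashlib

cache = {}  # stand-in for the Redis Cache

def infer(image_bytes: bytes) -> str:
    """Placeholder for a request routed to the neural model."""
    return "lesion-class" if image_bytes else "unknown"

def cached_infer(image_bytes: bytes) -> str:
    key = hashlib.sha256(image_bytes).hexdigest()
    if key in cache:               # cache hit: no model call
        return cache[key]
    result = infer(image_bytes)    # cache miss: run the model
    cache[key] = result
    return result
```

Keying by a content hash rather than a request ID means identical images submitted by different clients share one cached result.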
4. Web and mobile interfaces
Development of a dedicated web app for dermatologists (React) for viewing classifications, reviewing AI-assisted diagnoses and managing training datasets. Development of a patient mobile app (React Native) for guided lesion-image capture and report consultation. Python/Django backend with REST APIs.
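The payload exchanged between the Django backend and the React / React Native clients can be sketched as a small serialisable record; the field names below are illustrative, not the project's actual schema:

```python
# Hypothetical shape of a classification report served over the REST API.
# Field names are illustrative assumptions, not the project's schema.
import json
from dataclasses import dataclass, asdict

@dataclass
class LesionReport:
    pseudo_id: str        # pseudonymised patient reference
    lesion_class: str     # predicted class from the neural engine
    confidence: float     # model confidence in [0, 1]
    validated: bool       # set True after dermatologist review

def to_api_json(report: LesionReport) -> str:
    """Serialise a report as the JSON body of an API response."""
    return json.dumps(asdict(report))
```

On the Django side the same shape would typically be expressed as a serializer; the dataclass form keeps the sketch framework-independent.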
Activity conducted in collaboration with the dermatology centres of Siena and Livorno for clinical validation of results.
Technologies and approach
Deep learning with convolutional neural networks (CNNs) for classification and segmentation of dermatological images. Azure as the cloud platform, Docker for containerisation, Redis for caching and orchestration, Python/Django with REST APIs for the backend, React and React Native for the interfaces, and a Python/TensorFlow stack for the neural models. The modular architecture, inspired by Industry 4.0 principles, centralises the management of pseudonymised data and provides end-to-end traceability of samples.
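End-to-end traceability can be sketched as an append-only event log: each processing step records a timestamped event, so a sample's path through the chain can be reconstructed. The step names and storage are hypothetical stand-ins:

```python
# Sketch of sample traceability: each step appends a timestamped event
# so a sample's path through the chain can be replayed. Step names and
# the in-memory log are illustrative stand-ins for persistent storage.
from datetime import datetime, timezone

trace_log = []  # append-only event log

def record_step(sample_id: str, step: str):
    """Append a timestamped traceability event for a sample."""
    trace_log.append({
        "sample_id": sample_id,
        "step": step,
        "at": datetime.now(timezone.utc).isoformat(),
    })

def history(sample_id: str):
    """Return the ordered chain of steps for one sample."""
    return [e["step"] for e in trace_log if e["sample_id"] == sample_id]
```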