# be'ah MSW Detection & Classification Kit
This is the repository for the Waste Management System for be'ah.
## Problem Statement
Oman Environmental Service Holding Company S.A.O.C (be'ah) aims to automate the inspection and monitoring of Municipal Solid Waste (MSW) bins by creating an MSW Detection & Classification Kit (MDCK). This is intended to enhance the existing manual inspection and monitoring activities, operational oversight, and customer satisfaction levels. MDCK is an integrated end-to-end solution that automatically monitors, inspects, and reports on bins against service-level key performance indicators (KPIs) defined by be'ah, where each KPI is measured against each garbage disposal bin. The KPIs are:
| KPI | Name | Definition |
|---|---|---|
| WC1 | Bin Overflow | Waste container not overflowing to the point that the lid cannot be closed |
| WC2 | Hazard | CCP accessible and not a hazard to vehicles or pedestrians |
| WC3 | Bin Damage RFID | Containers operate, have RFID & no significant damage |
| WC4 | Bin Branding | Container branding in place to meet be'ah guidelines |
| WC5 | Clean Bin | Containers have a clean, hygienic appearance & no odours |
| WC6 | Bin 2 Meter | The ground surface within a 2 metre radius of the waste container shall be totally free of waste materials and residues |
| WC7 | Bin 20 Meter | The ground surface within a 20 metre radius of the waste container shall be free of waste materials and residues |
To measure these, be'ah has had to deploy employees to visit every single garbage disposal bin and manually assess whether the KPIs have been met. If a failure is found, the employee has to take a picture and manually write an incident report stating exactly what the failure was.
## Solution
The solution built in this repository is a wholly automated IoT system. It attaches cameras to all of be'ah's waste collection trucks and uses automated detection and classification machine learning models to decide whether the KPIs have been met. If a KPI failure is detected, an incident report appears on a web-based dashboard showing exactly which bin failed and which KPI(s) it failed.
The advantage of such a system is that it is wholly automated, requiring zero human interaction and thus creating large long-term resource savings for be'ah.

## System Overview
The system has two main parts:
- The Edge Device
- The Application Server
### The Edge Device
The Edge Device is located on the vehicle and connected to the onboard cameras. It is responsible for processing the images taken by the cameras and sending them, along with other metadata, to the Application Server. To accomplish this, the Edge Device has a few components.
#### RabbitMQ
RabbitMQ is used as the main message broker on the Edge: the cameras take pictures continuously, so a constant stream of data passes through the Edge. All requests are pushed to a queue and processed one by one, since the Application Server may not be able to handle a large number of requests at once.
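As a rough sketch of how a frame might be packaged and queued, assuming the `pika` client library and the `request_stream` queue name used by ML-Core (the payload field names mirror the JSON format shown later in this README; the exact request schema is an assumption):

```python
import base64
import json


def build_request(frame_bytes, longitude, latitude, timestamp, regi_no):
    """Package one camera frame plus its metadata into a JSON message
    for the queue. Field names follow the ML-Core response format in
    this README; the exact request schema is an assumption."""
    return json.dumps({
        "metadata": {
            "geolocation": {"longitude": longitude, "latitude": latitude},
            "timestamp": timestamp,
            "regi_no": regi_no,
        },
        "image": {
            "type": "base64",
            "data": base64.b64encode(frame_bytes).decode("ascii"),
        },
    })


def publish_frame(payload, host="localhost"):
    """Push one request onto the 'request_stream' queue."""
    import pika  # RabbitMQ client (assumed dependency)
    connection = pika.BlockingConnection(pika.ConnectionParameters(host))
    channel = connection.channel()
    channel.queue_declare(queue="request_stream", durable=True)
    channel.basic_publish(exchange="", routing_key="request_stream",
                          body=payload)
    connection.close()
```

Queueing this way lets the cameras keep producing frames at full rate while the consumers drain the queue at whatever pace the downstream services can sustain.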
#### Edge Server
The Edge Server processes all the images and passes them to the machine learning models for detection and classification. If a photo contains a bin, the Edge Server sends it, along with the timestamp, geolocation, and failed KPI(s), to the Application Server to report incidents.
#### Machine Learning Core
The ML-Core container is an image processing service that takes frames from the RabbitMQ message queue `request_stream` and applies the YOLOv5 object detection model to identify plastic 1100L waste bins.
Once a bin is detected, the ML-Core container runs three classification models to check the bin against the WC1, WC6, and WC7 KPIs.
The ML-Core container returns the results of the object detection and classification processes to `results_stream` in a structured JSON format. The structure of the JSON response is as follows:

```json
{
  "metadata": {
    "geolocation": {
      "longitude": 58.367536066666669,
      "latitude": 23.582551616666668
    },
    "timestamp": "May 25, 2023 at 8:10am (GST)",
    "regi_no": "1234ab"
  },
  "image": {
    "type": "base64",
    "data": "base64 string omitted...."
  },
  "kpi": ["wc1", "wc6", "wc7"]
}
```
Note: only the failed KPIs are returned in the results. Therefore, if a result contains an empty KPI list (e.g. `"kpi": []`), all KPIs have passed.
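Given that convention, a consumer of `results_stream` only needs to check whether the `kpi` list is empty. A minimal sketch (the shape of the incident record is an assumption, not the actual Application Server contract):

```python
import json
from typing import Optional


def handle_result(message: str) -> Optional[dict]:
    """Parse one message from 'results_stream'. Return an incident
    record when any KPI failed, or None when the KPI list is empty
    (i.e. all KPIs passed). The record shape is an assumption."""
    result = json.loads(message)
    if not result.get("kpi"):  # empty list => every KPI passed
        return None
    return {
        "regi_no": result["metadata"]["regi_no"],
        "geolocation": result["metadata"]["geolocation"],
        "timestamp": result["metadata"]["timestamp"],
        "failed_kpis": result["kpi"],
    }
```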
### The Application Server
TODO
## Contribution
### Prerequisites
Before contributing to (or running) this project, make sure that you have done the following:

- Set up the `~/.netrc` file to pull the `rihal.tech/foundation` library. Click here for instructions.
- Add `export GITHUB_LOGIN=YOUR_GITHUB_USERNAME` and `export GITHUB_TOKEN=YOUR_GITHUB_TOKEN` to `~/.bashrc`.
- Set up the `~/.npmrc` file to pull private NPM packages. Click here for instructions.
### Running the web app for the first time

Run the following command from the root of the repo:

```shell
make run
```

To stop the containers:

```shell
make stop
```
### Credentials

#### Test User

- username: `admin@beah.om`
- password: `admin123`
### Common Issues

If any problem arises when running the project, the first thing you should do is check the logs:

```shell
docker-compose logs -f <service-name>
```

Example:

```shell
docker-compose logs -f server
```