We designed DermaWorks to give an accurate diagnosis of common skin conditions, since searching the web often leads to inaccurate results: most reference images are indistinct and hard to tell apart. DermaWorks also makes skin condition diagnosis more affordable and convenient, acting as a diary that keeps track of your skin's progress.
- Scans and detects skin conditions from an image
- Allows customization of the user profile
- Tracks skin condition progress
- Validates authentic ingredients in skin products

How we built it

We built DermaWorks using Swift, CreateML for the image classifier model, Figma for design mockups, and Firebase for user logins.
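As a rough sketch of how the scan step can work: a CreateML classifier exports to a Core ML model, which the Vision framework can run on a photo. The class name `SkinConditionClassifier` below is a placeholder for the bundled model, not our actual class name.

```swift
import CoreML
import Vision

// Classify a skin photo with the bundled Core ML model and return the
// top label (e.g. "eczema"). SkinConditionClassifier is the class Xcode
// auto-generates from the compiled .mlmodel file (name is illustrative).
func classifySkin(image: CGImage, completion: @escaping (String?) -> Void) {
    guard let model = try? VNCoreMLModel(for: SkinConditionClassifier().model) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // Vision sorts results by confidence; take the best guess.
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier)
    }
    let handler = VNImageRequestHandler(cgImage: image)
    try? handler.perform([request])
}
```

Running inference through Vision rather than calling the model directly handles image scaling and orientation for us.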
At first, the machine learning model from CreateML was inaccurate. However, as we added more images and removed outliers from the data sets, accuracy improved. We also struggled with conflicting constraints across multiple iPhone models.
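The retrain-after-curating loop can be sketched in a macOS playground with CreateML. The directory paths here are placeholders; CreateML expects one subfolder of images per label (i.e. per skin condition):

```swift
import CreateML
import Foundation

// Hypothetical paths: each subfolder of TrainingData names one skin
// condition and contains the curated images for it (outliers removed).
let trainingDir = URL(fileURLWithPath: "/path/to/TrainingData")
let trainingData = MLImageClassifier.DataSource.labeledDirectories(at: trainingDir)

// Train a fresh classifier on the cleaned-up data set.
let classifier = try MLImageClassifier(trainingData: trainingData)

// Check accuracy on a held-out folder before shipping the model.
let testDir = URL(fileURLWithPath: "/path/to/TestData")
let evaluation = classifier.evaluation(on: .labeledDirectories(at: testDir))
print(evaluation.classificationError)

// Export the Core ML model for the app bundle.
try classifier.write(to: URL(fileURLWithPath: "/path/to/SkinConditionClassifier.mlmodel"))
```

Evaluating on images the model never saw during training is what actually tells you whether removing outliers helped.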
We managed to design a machine learning model despite it being our first time building one. We also designed our app mockup in Figma, which was difficult to pick up.
We learned how to use CreateML to generate a model for image classification and techniques to improve model accuracy. We also learned how to design mockups on Figma and create tree databases in Google's Firebase.
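A diary entry in Firebase's tree-structured Realtime Database could be written along these lines; the node names (`users`, `scans`) and entry fields are illustrative, not the app's actual schema:

```swift
import FirebaseDatabase

// Append one scan result under the signed-in user's node, growing the tree:
// users/<userID>/scans/<autoID>/{condition, confidence, timestamp}
// (node names and fields are hypothetical examples).
func logScan(userID: String, condition: String, confidence: Double) {
    let ref = Database.database().reference()
    let entry: [String: Any] = [
        "condition": condition,
        "confidence": confidence,
        "timestamp": ServerValue.timestamp()  // server-side time, not device time
    ]
    ref.child("users").child(userID).child("scans")
        .childByAutoId().setValue(entry)
}
```

Keying scans by an auto-generated child ID keeps entries in chronological order, which is what the progress-tracking diary view needs.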
We plan to improve the machine learning model and migrate it to a more robust object detection framework, so the model can be trained on only the relevant regions of each image. Furthermore, we plan to add a feature for user contact, connecting users with dermatologists and licensed professionals.