Fitting room from the future!

Project Brief: 3D Fashion Filter for Snapchat

Concept:

The project involves creating an innovative Snapchat filter that transforms the user's clothing into stylish 3D fashion models in real time. This filter will not only replace the user's current attire with a range of fashionable outfits but will also enhance the experience with captivating visual effects and a curated selection of music. The idea is to blend fashion, technology, and entertainment, offering users an immersive and interactive way to explore and showcase different fashion styles.

Key Features:

  1. 3D Fashion Transformation: Utilize advanced image recognition and 3D modelling to accurately map and replace the user's clothes with 3D fashion models. This includes dresses, suits, casual wear, and more, catering to diverse tastes and occasions.

  2. Dynamic Visual Effects: Integrate visually appealing effects that complement the chosen outfit. This could include shimmering, colour changes, or seasonal themes like falling leaves or snowflakes.

  3. Music Integration: Each outfit comes with a matching music track or sound effect, enhancing the overall experience. The music changes with the outfit, offering a range of genres from upbeat pop to elegant classical tunes.

  4. Interactive Experience: Users can switch between outfits with simple gestures or voice commands, making the experience interactive and engaging.

  5. Social Sharing: Seamlessly integrated sharing options to allow users to showcase their fashion transformations on Snapchat and other social media platforms.

Why This Meets the Brief:

  • Innovative Use of Technology: By leveraging Snapchat's AR capabilities, the filter introduces a unique way to engage with fashion using cutting-edge technology.

  • User Engagement: The combination of 3D fashion, visual effects, and music creates an engaging and entertaining experience, encouraging users to spend more time interacting with the filter.

  • Market Relevance: The filter taps into the growing interest in digital fashion and AR experiences, appealing to a tech-savvy audience that values creativity and personal expression.

  • Social Sharing Potential: With built-in sharing functionalities, the filter is designed to encourage viral content creation, increasing its visibility and usage.

  • Customization and Variety: Offering a range of outfits and music caters to a diverse user base, making the filter appealing to a wide demographic.

Conclusion:

This Snapchat filter project merges fashion, AR technology, and music to create a novel and engaging user experience. It aligns with current digital trends and has significant potential for widespread appeal and social media impact.


Market and Technology Context:

Creating a Snapchat filter built around 3D fashion clothing models aligns well with the latest trends and advancements in AR technology, especially in fashion and retail. Snapchat has been at the forefront of these innovations, with a strong focus on AR try-on technology.

Snapchat's Lens Studio 4.0, for instance, introduces the 3D Body Mesh feature, enabling more realistic and dynamic clothing simulations in AR. This technology addresses the challenges of rendering clothing in three dimensions, considering factors like the different folds and angles of fabric, as well as body tracking.

Fashion brands have been increasingly utilizing AR try-on features for their products. For example, Marc Jacobs and Dior have launched Snapchat and Instagram filters, respectively, allowing users to virtually try on items like handbags and hats. These filters offer a natural look, as if the user is wearing the accessory, rather than just overlaying it in AR.

In summary, the project fits perfectly into the current landscape of AR and fashion technology. Snapchat's advancements in AR, combined with the growing interest from both consumers and retailers in virtual try-on experiences, provide a solid foundation for the Snapchat filter project. The technology is not only enhancing the user experience but also aiding businesses in adapting to the new digital fashion era.

Timetable (in days)

PART 1:

Preparation for the presentation and familiarization with the capabilities of Lens Studio, including its strengths and weaknesses.

PART 2:

Raw code build

  1. System for full body tracking (a simplified code sketch follows this list).

Constants and Variables:

  • onFoundTrigger, onLostTrigger: Events triggered when full body tracking starts or stops.

  • MAX_BODY_SIZE, CENTER, MIN_LOST_FRAMES: Constants defining the maximum body size, the centre point, and the minimum number of frames before tracking is considered lost.

  • FULL_BODY_IDs, MAIN_POINTS_IDs: Arrays holding IDs of different body points used for tracking.

  • FullBodyTracking Function:

    • Initializes tracking for each body point defined in FULL_BODY_IDs.

    • Creates a special 'Hip' point as the median of the left and right hips.

    • Includes methods for checking if the main body points are being tracked and for calculating the body size.

  • Tracking State Machine:

    • Defines different states (NONE, TOO_CLOSE, TRACKING) for the tracking system.

    • Implements a state machine pattern to handle transitions between these states based on tracking data.

  • Event Bindings:

    • Binds functions to different events (TurnOnEvent, UpdateEvent, FaceFoundEvent, FaceLostEvent) to handle changes in tracking state.

  • State Change Callbacks:

    • Functions like onTrackingEnter, onTrackingExit, and onTooCloseEnter are called when the state machine transitions to a new state. These functions handle events like triggering custom actions or showing/hiding hints.

  • BodyPoint and MedianBodyPoint Functions:

    • Define classes for individual body points and a median body point (like the 'Hip' point) to facilitate tracking and position calculations.

  • StateMachine Class:

    • Implements a state machine with functions to add states, set the current state, and get the current state.

  • Helper Functions (debugPrint, checkInputs):

    • Provide utility for debugging and checking if necessary inputs and scripts are set up correctly.
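
To make the structure above concrete, here is a minimal sketch of the tracking state machine in Lens Studio JavaScript. It assumes the FullBodyTracking helper described above exposes isTracking() and getBodySize(); the state names, trigger strings, and the MAX_BODY_SIZE value are illustrative assumptions rather than the project's exact values.

var STATE = { NONE: "NONE", TOO_CLOSE: "TOO_CLOSE", TRACKING: "TRACKING" };

// Simple state machine: named states with optional enter/exit callbacks.
function StateMachine() {
    this.states = {};
    this.current = null;
}
StateMachine.prototype.addState = function (name, onEnter, onExit) {
    this.states[name] = { onEnter: onEnter, onExit: onExit };
};
StateMachine.prototype.setState = function (name) {
    if (this.current === name) {
        return;
    }
    if (this.current && this.states[this.current].onExit) {
        this.states[this.current].onExit();
    }
    this.current = name;
    if (this.states[name].onEnter) {
        this.states[name].onEnter();
    }
};
StateMachine.prototype.getState = function () {
    return this.current;
};

function onTrackingEnter() {
    // The Behavior helper script exposes sendCustomTrigger; "body_found" is a
    // hypothetical trigger name.
    global.behaviorSystem.sendCustomTrigger("body_found");
}
function onTrackingExit() {
    global.behaviorSystem.sendCustomTrigger("body_lost");
}
function onTooCloseEnter() {
    print("Step back so your full body fits in the frame"); // user hint
}

var trackingState = new StateMachine();
trackingState.addState(STATE.NONE, null, null);
trackingState.addState(STATE.TOO_CLOSE, onTooCloseEnter, null);
trackingState.addState(STATE.TRACKING, onTrackingEnter, onTrackingExit);
trackingState.setState(STATE.NONE);

var MAX_BODY_SIZE = 1.5; // assumed "too close" threshold

// Every frame, pick the state from the current tracking data.
script.createEvent("UpdateEvent").bind(function () {
    if (!FullBodyTracking.isTracking()) {
        trackingState.setState(STATE.NONE);
    } else if (FullBodyTracking.getBodySize() > MAX_BODY_SIZE) {
        trackingState.setState(STATE.TOO_CLOSE);
    } else {
        trackingState.setState(STATE.TRACKING);
    }
});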

  2. Defining and detecting custom body gestures based on the positions and rotations of various body parts (a simplified code sketch follows this list).

Constants and Bone IDs:

  • FULL_BODY_BONE_IDs: Lists the IDs of bones (body parts) used for gesture recognition (like arms, legs, spine).

  • VEC2_UP, PI: Defines a vector pointing upwards and the mathematical constant π (Pi).

  • Pose Library:

    • A collection (PoseLibrary) of predefined poses with specific angles for different bones. Users can add custom poses by recording and copying angles from the logger.

  • Bone Function:

    • Defines a Bone class to represent each bone with start and end points, and a parent (for hierarchical bone structures).

    • Includes methods for tracking and calculating the rotation of each bone.

  • Bone Initialization:

    • Creates instances of the Bone class for different body parts like upper arms, forearms, upper legs, lower legs, and spine, using points from the FullBodyTracking object.

  • Pose Detection Functions:

    • getCurrentPose: Calculates the current pose by getting the rotation of each bone.

    • printPose: Prints the current pose to the console if the body is being tracked.

    • poseFromJson: Creates a pose object from a JSON string.

  • Pose Matching:

    • isMatchingPose: Checks if the current body pose matches a given pose within a specified threshold.

    • getDifference: Helper function to calculate the angular difference between two angles.

  • Gesture Recognition:

    • getGestureFrames: Retrieves a sequence of poses (a gesture) from the pose library.

    • Utilizes the predefined poses to recognize specific gestures based on the current body posture.

  • Helper Functions and Initialization:

    • debugPrint: Utility function for logging messages.

    • Initializes the tap event that triggers printPose and sets up API methods for external access.
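
As a rough illustration of the pose and gesture logic above, the sketch below shows how bone rotations could be captured and matched against a pose library. It assumes each tracked body point exposes a getPosition() method returning a 2D screen position; the pose names, angles, and threshold are hypothetical values, not the project's recorded poses.

var PI = Math.PI;

// A bone is the segment between two tracked points; its rotation is the
// angle of that segment in screen space.
function Bone(startPoint, endPoint) {
    this.start = startPoint;
    this.end = endPoint;
}
Bone.prototype.getRotation = function () {
    var dx = this.end.getPosition().x - this.start.getPosition().x;
    var dy = this.end.getPosition().y - this.start.getPosition().y;
    return Math.atan2(dy, dx); // radians in (-PI, PI]
};

// The current pose is the rotation of every named bone.
function getCurrentPose(bones) {
    var pose = {};
    for (var name in bones) {
        pose[name] = bones[name].getRotation();
    }
    return pose;
}

// Smallest angular difference between two angles, wrapped into [0, PI].
function getDifference(a, b) {
    var d = Math.abs(a - b) % (2 * PI);
    return d > PI ? 2 * PI - d : d;
}

// A pose matches when every bone in the target pose is within the threshold.
function isMatchingPose(currentPose, targetPose, threshold) {
    for (var name in targetPose) {
        if (getDifference(currentPose[name], targetPose[name]) > threshold) {
            return false;
        }
    }
    return true;
}

// Example library entries (angles in radians, purely hypothetical values).
// A gesture is a sequence of one or more pose frames.
var PoseLibrary = {
    arms_up: [{ L_upper_arm: PI / 2, R_upper_arm: PI / 2 }],
    wave: [{ R_forearm: PI / 3 }, { R_forearm: 2 * PI / 3 }]
};

function getGestureFrames(name) {
    return PoseLibrary[name] || [];
}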

  3. Movement or gesture recognition (a simplified code sketch follows this list):

Initial Setup:

  • CENTER: A zero-vector constant used as a reference point.

  • Variables for tracking movement start, delays, and keeping pose times.

  • Check if the necessary components (FullBodyTracking, behaviorSystem) are available.

  • Movement Type and Gesture Handling:

    • Determines whether the movement is a multi-frame gesture (a sequence of more than one pose).

    • Selects the appropriate matching function based on script.moveType (e.g., gesture, touch screen trigger, distance check).

  • Gesture Matching:

    • The getMatchingFunction function returns a closure that encapsulates the logic for checking whether the current movement matches a predefined gesture or other criteria (such as a screen touch or the distance between two points).

  • Update Event Binding:

    • Binds the OnUpdate function to the UpdateEvent, which is called once per frame.

  • Update Logic:

    • Checks whether the current movement or pose matches the predefined criteria.

    • Handles the logic for starting and ending movements, including delays and gesture sequence handling.

    • Updates curFrameIndex to track progress through multi-frame gestures.

  • Start and End Functions:

    • start: Triggered when a movement starts, sending custom triggers and initializing variables.

    • end: Triggered when a movement ends, sending completion triggers and resetting variables.

  • Gesture Matching Logic:

    • isMatchingGesture: Checks if the current pose matches the current frame of a gesture sequence.

  • Debug Printing:

    • debugPrint: Utility function for logging messages, controlled by a debug flag.
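
Building on the pose helpers sketched above, a simplified version of this movement script could look like the following. The script inputs, trigger names, and the bones map (the Bone instances created from the tracked points) are assumptions, and delay and pose-hold handling are omitted for brevity.

//@input string moveType = "gesture"   // "gesture" or "screen_touch"; a distance check could be added the same way
//@input string gestureName = "wave"
//@input float threshold = 0.5
//@input string startTrigger = "move_start"
//@input string endTrigger = "move_end"

var curFrameIndex = 0;    // progress through a multi-frame gesture
var gestureFrames = [];   // the sequence of poses that make up the gesture
var moveStarted = false;

// Choose the matching strategy once, based on the configured movement type.
function getMatchingFunction() {
    if (script.moveType === "screen_touch") {
        var touched = false;
        script.createEvent("TapEvent").bind(function () { touched = true; });
        return function () {
            var wasTouched = touched;
            touched = false;
            return wasTouched;
        };
    }
    // Default: match the current body pose against a (possibly multi-frame)
    // gesture taken from the pose library.
    gestureFrames = getGestureFrames(script.gestureName);
    return function () {
        return isMatchingPose(getCurrentPose(bones),
                              gestureFrames[curFrameIndex],
                              script.threshold);
    };
}

var isMatching = getMatchingFunction();

function start() {
    moveStarted = true;
    global.behaviorSystem.sendCustomTrigger(script.startTrigger);
}

function end() {
    moveStarted = false;
    curFrameIndex = 0;
    global.behaviorSystem.sendCustomTrigger(script.endTrigger);
}

// Runs every frame: start, advance, or finish the movement depending on the match.
script.createEvent("UpdateEvent").bind(function () {
    if (isMatching()) {
        if (!moveStarted) {
            start();
        }
        if (curFrameIndex < gestureFrames.length - 1) {
            curFrameIndex++; // step through a multi-frame gesture
        }
    } else if (moveStarted) {
        end();
    }
});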

  4. Actions that trigger visual and animation effects (a simplified code sketch follows this list):

Initial Disabling of Trigger Objects:

  • Iterates over script.triggerObjects and disables each of them, ensuring the trigger objects remain inactive until the start trigger fires.

  • OnTrigger Function:

    • Activated when a certain trigger is detected.

    • Enables all script.triggerObjects.

    • Starts animations for each object in script.triggerTweens using global.tweenManager.

    • Plays animated textures in script.triggerAnimatedTextures.

    • Initiates particle effects in script.particleBursts.

  • OnTriggerEnd Function:

    • Handles the logic for what happens when the trigger ends.

    • If there are any tweens (animations), it starts the fade-out animation and calls OnTriggerEndTweenEnd upon completion.

    • If there are no tweens, it directly calls OnTriggerEndTweenEnd.

  • OnTriggerEndTweenEnd Function:

    • Disables all script.triggerObjects.

    • Stops any animated textures in script.triggerAnimatedTextures.

    • Stops particle effects in script.particleBursts.

  • Behaviour System Integration:

    • Checks if global.behaviorSystem is available.

    • Adds custom responses to script.startTrigger and script.endTrigger, linking them to OnTrigger and OnTriggerEnd functions, respectively.

  • Debug Print Function:

    • A utility function, debugPrint, for logging debug messages to the console.
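
Putting these pieces together, a simplified version of the effects script could look like the sketch below. It assumes the standard Behavior and Tween Manager helper scripts are present in the project; the input names mirror the description above, while the tween names, trigger strings, and the way particle bursts are started are assumptions.

//@input SceneObject[] triggerObjects
//@input SceneObject[] triggerTweens
//@input string tweenName = "fade_in"          // hypothetical tween names
//@input string fadeOutTweenName = "fade_out"
//@input Asset.Texture[] triggerAnimatedTextures
//@input SceneObject[] particleBursts
//@input string startTrigger
//@input string endTrigger

// Helper to enable or disable a list of scene objects.
function setObjectsEnabled(objects, enabled) {
    for (var i = 0; i < objects.length; i++) {
        objects[i].enabled = enabled;
    }
}

// Keep everything hidden until the start trigger fires.
setObjectsEnabled(script.triggerObjects, false);
setObjectsEnabled(script.particleBursts, false);

function OnTrigger() {
    setObjectsEnabled(script.triggerObjects, true);
    // Enabling the objects that hold the particle emitters; the project may
    // start its bursts differently.
    setObjectsEnabled(script.particleBursts, true);
    for (var i = 0; i < script.triggerTweens.length; i++) {
        global.tweenManager.startTween(script.triggerTweens[i], script.tweenName);
    }
    for (var j = 0; j < script.triggerAnimatedTextures.length; j++) {
        script.triggerAnimatedTextures[j].control.play(-1, 0); // loop the animated texture
    }
}

function OnTriggerEnd() {
    var tweens = script.triggerTweens;
    if (tweens.length > 0) {
        for (var i = 0; i < tweens.length; i++) {
            if (i === tweens.length - 1) {
                // Clean up once the last fade-out tween completes.
                global.tweenManager.startTween(tweens[i], script.fadeOutTweenName, OnTriggerEndTweenEnd);
            } else {
                global.tweenManager.startTween(tweens[i], script.fadeOutTweenName);
            }
        }
    } else {
        OnTriggerEndTweenEnd();
    }
}

function OnTriggerEndTweenEnd() {
    setObjectsEnabled(script.triggerObjects, false);
    setObjectsEnabled(script.particleBursts, false);
    for (var i = 0; i < script.triggerAnimatedTextures.length; i++) {
        script.triggerAnimatedTextures[i].control.stop();
    }
}

// Wire the custom triggers up through the Behavior helper script.
if (global.behaviorSystem) {
    global.behaviorSystem.addCustomTriggerResponse(script.startTrigger, OnTrigger);
    global.behaviorSystem.addCustomTriggerResponse(script.endTrigger, OnTriggerEnd);
}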

PART 3:

Adjusting and refining the 3D Try-On Pack available from the Lens Studio Asset Store and investigating the potential for incorporating external 3D models.

PART 4:

Incorporating the 3D models supplied by the artists and meticulously adjusting the trigger points.

PART 5:

Preparation of the Portfolio.

A walk-through video of the application.

Reflection on the Snapchat 3D Fashion Filter Project

Looking back at this project, I believe we largely met our initial goal. Our objective was to create an innovative, engaging, and user-friendly filter that allows users to try on virtual fashion items. The successful integration of 3D clothing models, accompanied by captivating visual effects and music, provided a unique and immersive experience for users, aligning well with our vision.

Challenges Faced:

The most challenging part of this project was ensuring the accuracy and realism of the 3D clothing models. Developing a technology that could seamlessly adapt these models to different body types and movements in real-time required extensive research and testing. Another challenge was optimizing the filter's performance to ensure it runs smoothly on various devices without compromising on the quality of the experience.

Teamwork and Distribution of Work:

Regarding the distribution of work, our team dynamics were mostly positive, with each member bringing unique skills to the table. However, there were instances where the workload could have been more evenly distributed. Some team members took on multiple responsibilities, leading to occasional imbalances. In future projects, a more structured approach to task delegation and regular check-ins on workload distribution could enhance our efficiency and prevent burnout.

Concluding Thoughts:

Overall, this project has been a valuable learning experience. It not only pushed the boundaries of AR technology in fashion but also highlighted the importance of teamwork, effective communication, and the ability to adapt to challenges. The positive reception of the filter upon launch is a testament to the hard work and dedication of our team. Going forward, we plan to incorporate user feedback to further refine and expand the filter's capabilities, ensuring it continues to lead in innovation and user engagement.