Obfuscated Options
A dark pattern where options to opt out or cancel services are deliberately hidden or made difficult to find.
UI/UX design tactics that intentionally manipulate users into taking actions they might not otherwise take.
A dark pattern where users are tricked into confirming a subscription through misleading language or design.
The design of products, devices, services, or environments for people with disabilities or specific needs.
AI Trust, Risk, and Security Management (AI TRiSM) is a framework for ensuring AI systems are safe, reliable, and ethical by governing their trustworthiness, risk exposure, and security.
A dark pattern where questions are worded in a way that tricks the user into giving an answer they didn't intend.
A dark pattern where options that benefit the service provider are pre-selected for the user.
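As an illustration, a hypothetical signup form might pre-select options that favor the provider, so a user who simply accepts the defaults is enrolled without making an active choice. This is a minimal Python sketch; the field names and defaults are invented for the example:

```python
# Hypothetical signup form: provider-friendly options are pre-selected,
# so they stay on unless the user actively unchecks them.
PRESELECTED_DEFAULTS = {
    "marketing_emails": True,          # benefits the provider
    "share_data_with_partners": True,  # benefits the provider
    "two_factor_auth": False,          # benefits the user, yet off by default
}

def submit_form(user_choices=None):
    """Merge the user's explicit choices over the pre-selected defaults."""
    form = dict(PRESELECTED_DEFAULTS)
    form.update(user_choices or {})
    return form

# A user who just clicks "Sign up" inherits every pre-selected option;
# an attentive user must opt out of each one individually.
print(submit_form())
print(submit_form({"marketing_emails": False}))
```

The manipulation lies entirely in the defaults: the form never forces anything, it just makes inaction equivalent to consent.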
A strategic framework that designs user experiences to guide behavior and decisions towards desired outcomes.
Human-in-the-Loop (HITL) integrates human judgment into the decision-making process of AI systems, typically by having a person review, correct, or approve automated outputs.
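One common HITL arrangement routes low-confidence model decisions to a human reviewer while letting high-confidence ones through automatically. The sketch below assumes this confidence-threshold design; the threshold value and function names are illustrative, not from any specific framework:

```python
CONFIDENCE_THRESHOLD = 0.9  # illustrative cutoff, tuned per application

def hitl_decide(label, confidence, human_review):
    """Accept the model's label only when confidence is high;
    otherwise defer the decision to a human reviewer."""
    if confidence >= CONFIDENCE_THRESHOLD:
        return label, "auto"
    return human_review(label), "human"

# Usage: a low-confidence prediction is escalated to the reviewer.
reviewer = lambda suggested: "approved"  # stand-in for a real human
print(hitl_decide("approved", 0.95, reviewer))  # ('approved', 'auto')
print(hitl_decide("rejected", 0.40, reviewer))  # ('approved', 'human')
```

The second call shows the point of the pattern: the human can overturn the model's suggestion, so the system's final answer reflects human judgment where the model is least certain.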