Recommendations
Explore
Contribute
About
Updates
Build
Fund
Research
Measure
Leverage AI
Practice
Recommendations
Build
Fund
Research
Measure
Leverage AI
Practice
Explore
Contribute
About
Updates
← Explore Research Questions
Research Question
How can we design information presentation formats that minimize susceptibility to framing effects?
Related Goals
Assembly designs that are robust to both internal and external manipulation attempts.
Urgent
Related Capability
Resist manipulation
Urgent
Ability to resist manipulation that would decrease trustworthiness, legitimacy or unfairly influence the outcome.
Maturity
Opportunity
Importance
Neglectedness
Dimension: Robustness
Related Existing Resources
Adversarial testing for Generative AI
Google’s guide defining adversarial testing as systematically evaluating ML models against malicious or inadvertently harmful input, covering explicit queries (containing policy-violating language) and implicit queries (seeming harmless but involving sensitive topics). The four-stage workflow inv...
Research