U.S. government design principles
These design principles are meant to help teams across government align around important, common goals and make better use of the design system. They serve as an evaluative lens for making design and implementation decisions. However you build it, a USWDS project should support these principles.
Start with real user needs
Does your product or service have access to the resources necessary to perform research?
Who is your primary audience?
What user needs will this product or service address?
Do you use personas or other audience segment analysis techniques to connect your solutions to different segments of your audience?
How often are you testing with real people?
Which people will have the most difficulty with the product or service?
Which research methods were used?
What were the key findings?
How and where were the findings documented?
Earn trust
Do users understand that this is a government site or service?
What are the public’s expectations of your product?
What private or sensitive data do you ask your users to provide?
What are you doing to keep that data private?
Does your product utilize redundancy to minimize the effect of server failure or traffic spikes?
Does your product use continuous integration testing to prevent unintended regressions?
Can users edit or undo actions, or edit data they’ve added to the system?
How often do you check that your service works as intended?
What components are made available to the public as open source?
How quickly do you respond to bug reports?
Is your content written in clear, easy-to-follow plain language?
Do you provide meaningful access to people with limited English proficiency?
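Plain-language reviews can be partly automated. As an illustrative sketch only (the function name and the short-sentence heuristic are this example's own, not official guidance), a script might flag content whose average sentence length is high, since short sentences are a common plain-language signal:

```python
import re

def avg_sentence_length(text: str) -> float:
    """Average words per sentence -- a rough plain-language signal.

    Splits on sentence-ending punctuation and averages the word counts;
    real plain-language review still requires a human reader.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return sum(len(s.split()) for s in sentences) / len(sentences)

# Two short sentences: 2 words and 4 words, averaging 3.0.
print(avg_sentence_length("Pay online. It takes five minutes."))  # -> 3.0
```

A check like this only surfaces candidates for review; it cannot judge clarity, tone, or whether the content serves people with limited English proficiency.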
Embrace accessibility
Can users navigate your site using only the keyboard?
Can users use a screen reader to access the page content?
Can users quickly understand the main points of your content?
Can users easily interpret content associated with graphic elements?
Can users easily understand and complete key tasks?
Are you testing your service with a broad range of users?
Do you know your agency accessibility team?
Is your site organized such that everyone can navigate it easily?
Are you using accessibility testing tools?
Did your accessibility testing tools provide accurate results?
Are you providing content in languages other than English, as appropriate for the audience?
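Automated accessibility testing tools perform many checks; the flavor of one such check can be sketched with Python's standard library. This toy example (not a substitute for a real WCAG-oriented tool) counts `<img>` tags that lack an `alt` attribute, a common accessibility failure:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flags <img> tags with no alt attribute (a common WCAG failure)."""

    def __init__(self):
        super().__init__()
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; an empty alt="" still counts
        # as present, which is correct for decorative images.
        if tag == "img" and "alt" not in dict(attrs):
            self.missing_alt += 1

checker = AltTextChecker()
checker.feed('<img src="seal.png" alt="Agency seal"><img src="chart.png">')
print(checker.missing_alt)  # -> 1 (the chart image has no alt text)
```

As the questions above note, automated results are not always accurate: tools cannot tell whether existing alt text is meaningful, so manual testing with real users remains essential.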
Promote continuity
Do you know if your audience understands that your product is a government site or service?
Do you know if your audience understands the purpose of each page or section?
Is it always clear what users are expected to do next?
Does your agency have established style guidance?
Have you tried and tested shared solutions before developing your own?
Have you considered your service in the context of customer or user journeys?
Have you identified your highest-impact customer or user journeys? Within these journeys, have you identified specific opportunities at which to collect feedback?
Have you considered your service in the broader context of a service ecosystem?
Can you reach across agencies and silos to collaborate and share solutions?
Does your site or service have a consistent experience on any device or browser?
Do users have equivalent access to your information and services on any device?
What factors outside the scope of your product or service affect its success?
What other government products or services are related to the success of your product or service?
Are you able to coordinate solutions with other projects that share a similar audience?
Listen
Evaluate and improve your product by listening to your audience and learning from what you hear.
Does your product or service have access to people with design, development, and research skills?
What are the key metrics your service uses to measure success?
How are your success metrics tied to positive customer or user outcomes?
How have these metrics performed over the life of the service?
Do you have system monitoring tools and processes in place to identify and respond to issues?
Which tools are in place to measure user behavior, and how do you use them?
Do you measure customer satisfaction and take steps to improve satisfaction?
Do you assess your customer experience maturity and develop action plans to identify focus areas for improvement?
How are you collecting user feedback for bugs and other product issues?
Do all members of the project team participate in user interviews and research activities?
Do you cultivate direct community participation in your project with activities like hackathons?
How often are you reviewing and addressing feedback and analytics?
Do you contribute feedback to services your project uses?
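One common satisfaction metric behind questions like these is CSAT. A minimal sketch, assuming a 1–5 survey scale (the function name and threshold are illustrative, not prescribed by USWDS):

```python
def csat_score(ratings: list[int], threshold: int = 4) -> float:
    """CSAT: the share of respondents rating satisfaction at or above
    the threshold on a 1-5 scale, expressed as a percentage."""
    if not ratings:
        raise ValueError("no survey responses")
    satisfied = sum(1 for r in ratings if r >= threshold)
    return 100 * satisfied / len(ratings)

# 3 of 5 respondents rate 4 or higher.
print(csat_score([5, 4, 3, 5, 2]))  # -> 60.0
```

A single score is only a starting point: tying it to specific journeys, tracking it over the life of the service, and acting on the underlying feedback is what the questions above are really asking about.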