The document summarizes a presentation given by Carl Erickson and Brittany Hunter of Atomic Object on integrating design and development to create great software. The presentation covered topics like open environments, innovation games, contextual inquiry, mental models, flexible prototypes, pair programming, test-driven development, kanban, and validating assumptions using paper prototyping. It concluded with a question and answer session.
MSU Women in Computing - Integrating Design and Development
1. Integrating Design and Development to Make Great Software
MSU Women in Computing
November 6, 2012
Carl Erickson & Brittany Hunter
www.atomicobject.com
23. Case Study: Paper Prototypes
24. Conclusion + Q&A
Editor's Notes
I want to start by telling you a little about our company. Atomic Object turned 11 years old this fall. We're a Michigan company, and we get to work with world-class clients, from small startups to Fortune 500 companies. Some of our clients are local; others are worldwide.
We have about 30 developers, designers, and testers, and we work out of two offices. Our first office is in Grand Rapids; the other opened in Detroit a few months ago.
This is a look at our office. Our open environment encourages collaboration and communication between teams and team members. We don't have a design department and a development department. We work together in integrated, poly-skilled product teams. A team will generally have a few developers and one or two designers. Team members work together and share tasks. Because of the way our office is set up, it's easy to share knowledge and expertise across teams. If somebody else in the room knows more about a topic than you do, you can grab them and ask a question almost anytime.
At Atomic Object, we make software products. We work on many different platforms and technologies, from mobile (iPhone and Android) to web applications, desktop applications, and even development of embedded devices.
Each project starts out in the sales pipeline. Carl, Mike, and Shawn are the Upfront Team; they work with potential clients to determine whether Atomic is a good fit for the project, and vice versa. Although their day-to-day work is mostly comprised of working with clients and operationalizing the company, Carl, Mike, and Shawn are also accomplished software developers. Their deep knowledge of the craft gives us a real advantage when estimating and scheduling projects.
When we start a project, we put together a team of designers and developers to work on it. Then everybody--designers, developers, and the stakeholders from the client's organization--gets together for a meeting called the Project Kickoff. During this meeting, we get to know each other and dig deep into the question: what problem are we really trying to solve with this software?
During the project kickoff, we do some interesting things to keep things going and keep people talking. During this particular exercise, everybody on the team was asked, “If this software came in a box, what would be on the box?” This helped discover some of the key features and goals of the software.
During the project kickoff, we also work together to create a Product Backbone. This is a list of actions that the user will do in the software, and the features within each action.
All of the activities we do during the project kickoff are meant to jumpstart our research and give the project team a basic understanding of the problems that need to be solved. After the kickoff, it's time to do more research. When creating complex software, designers and developers must become domain experts in a short amount of time. In order to create useful, usable software, we need to know as much as we can about the industry in which it will actually be used. How do we accomplish this?
One tool we use is Contextual Inquiry. We go to our users--to their workplaces, and the environments where they'll be using the software. We study their habits and the jobs they do. Being with somebody and watching them do their work gives us information that a simple user interview in a conference room wouldn't give.
Another way to understand your users' needs is through mental modeling. A mental model is an explanation of someone's thought process about how something works in the real world. Software that is usable and intuitive will conform to how people think about their domain. This example is from our TagWizard project. The TagWizard is a small device that allows users to find and reserve desks and office spaces. We're currently building a web-based admin tool that will enable facility administrators to set up and organize these devices. In order to build the tool, we needed to understand how facility administrators think about a space. So this is a floor plan of an office space, color-coded to denote different kinds of workspaces. We did site visits at Steelcase to see how their offices are organized, and we used several of these plans to validate the scheme we designed for organizing TagWizards.
After we do research, it's time to build the product. I'm going to talk about our strategies for usable design, and then I'll talk about how we test our designs.
At AO, teams create a solution together. To get things done, we depend on fast iteration and quick communication cycles. To do this, we often use sketches and storyboards to communicate. They also help us to remember the context that our users will be in as they use the software.
In our teams, because we work under the same roof and (usually) at the same desks, detailed specs and wireframes like these are unnecessary. Real-time communication is much more helpful.
Sketches like these are easier and quicker to create and maintain than big, detailed spec documents and wireframes.
In our integrated teams, each person has some unique skill sets, while other skill sets overlap. Developers take care of the coding. Designers take care of hi-fidelity design: Photoshop, the finer details of interaction design, et cetera. There are areas in the middle where we overlap and all work together. When you have intelligent people working together on a product, you get the maximum benefit by involving as many different perspectives in as many different areas as possible. So our designers end up doing a lot of what you might traditionally think of as “development,” delving into markup and JavaScript, while our developers get involved in a lot of areas that are more traditionally seen as design. Flexible teams are productive teams: no bottlenecks, no waiting for the designer or the developer.
We use a lot of core principles of Agile Development. We work in short iterations, which means we meet with our client every week to show off completed features and prioritize what features to work on next. We also practice test-driven development.
Example of TDD code.
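The slide's actual code isn't preserved in this transcript. As a stand-in, here is a minimal sketch of the test-driven rhythm the talk describes--write a failing test first, then just enough code to make it pass--using Python's built-in unittest. The `slugify` function and its behavior are illustrative assumptions, not from the original slide.

```python
import unittest

# Step 1 (red): write the tests first, describing the behavior we want.
class TestSlugify(unittest.TestCase):
    def test_lowercases_and_joins_words_with_hyphens(self):
        self.assertEqual(slugify("Great Software"), "great-software")

    def test_strips_surrounding_whitespace(self):
        self.assertEqual(slugify("  Atomic Object  "), "atomic-object")

# Step 2 (green): write just enough code to make the tests pass.
def slugify(title):
    return "-".join(title.split()).lower()

if __name__ == "__main__":
    unittest.main(exit=False)  # run the tests without exiting the interpreter
```

The point is the order of work: the tests above existed (and failed) before `slugify` did, so the implementation is driven by, and never larger than, the behavior the tests demand.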
We don't have dedicated project managers. Each product team has a product lead. It's this person's responsibility to track hours worked and keep an eye on how the project's budget is doing. Product teams work with clients to get the most out of the budget and make decisions about project scope--what features to include and which ones to drop, if necessary. Every week, we create a burn chart, which shows how much money has been spent versus how much of the project is completed. Teams print the burn charts and keep them visible in our workspace, and burn charts are delivered to the client every week as well.
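The arithmetic behind a burn chart is simple: each week, compare the percentage of budget consumed against the percentage of scope completed. A rough sketch, with entirely invented numbers (the talk doesn't give any):

```python
# Hypothetical project: 400 budgeted hours, 20 planned features.
BUDGET_HOURS = 400
TOTAL_FEATURES = 20

# Weekly snapshots: (hours billed so far, features completed so far).
weeks = [(60, 4), (140, 8), (200, 12)]

for n, (hours, features) in enumerate(weeks, start=1):
    spent = 100 * hours / BUDGET_HOURS      # percent of budget burned
    done = 100 * features / TOTAL_FEATURES  # percent of scope completed
    print(f"week {n}: {spent:.0f}% of budget, {done:.0f}% of scope")
```

When the "budget" line climbs faster than the "scope" line, that's the early-warning signal the product lead uses to start the scope conversation with the client.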
Product teams use a variety of tools to track tasks. This is a kanban board, used to track a feature's progress through the phases of initial sketching, visual design, markup, code, testing, and done.
At AO, we create software within a culture of validation. We validate user assumptions, IA and interactions, development builds, code, deployment, and completed products. We're constantly asking: how well does our solution do the job? Like code testing, usability testing is one way we validate our software. This photo is from an actual usability test session. Users were asked to complete a set of tasks in our software, and the software team watched from another room (on a closed-circuit video feed) as the test unfolded.
On another project, we used paper prototypes to test how users interacted with a new device that our client was creating. We were specifically interested in whether users could understand the icons we designed for each function. We didn't have the actual device available to use, so we created paper prototypes that mimicked the device's behavior.