Inside the future meeting room
E&T reports on table-top computing and other collaborative workspace technologies from Edinburgh Napier University's state-of-the-art Meeting Room of the Future.
You know the situation: you're in a meeting and you need to check some facts, only you didn't bring your laptop, and just one person can log on to the Wi-Fi, so everyone is crowding around that one screen, gesticulating, trying to get their point across... It is this kind of problem that the Centre for Interaction Design at Edinburgh Napier University is working to overcome. In January 2009, the university funded a project to explore the impact of emerging technologies on how we live our lives and go about our daily activities in public, private and social spaces. This project, called Future Living Future Life, seeks to deliver real-world practical benefits by designing, developing and implementing a series of 'future' spaces, the first of which is the Future Meeting Room.
So just what is a Future Meeting Room? Well, essentially it is a space designed to use the very cutting edge of current and emerging technologies to facilitate local and remote collaborative activities. In less academic terms, this means brainstorming, mind mapping, document or project review and creation - in fact, anything that requires people to work together in real time. As you might imagine, this covers a lot of things.
Interaction design is the process of understanding the whole shebang: not simply what people say they want and need, but what they actually want and need - the two are often very different. Associated with this are factors such as where they need to do it, how they currently do it and, critically, what problems are associated with that current practice. When a deep understanding of a specific practice emerges, so too can ways of making it more intuitive, more effortless, more productive, more efficient, even more fun - whatever is appropriate.
An example of this is the iPhone. The critical factor was the design philosophy, a bottom-up approach that questioned the traditional paradigms of how computing tasks are achieved and, more importantly, how they should be achieved. Central to this was the notion of developing an entirely touch-based operating system considered for the context of use rather than repurposing desktop computing metaphors to fit into scenarios for which it was ill-suited, in this case, mobile computing.
In the iPhone, the user interface has been developed to serve the context - it applies a blank slate approach where the interface adapts to the specific task in hand. It is appropriate, effortless and hence it has become the benchmark for intuitive computing devices against which all others are measured. Both the iPhone and iPad provide a framework, an approach on which developers can offer specific functionality through individual apps, but which at its heart is a philosophy of removing the mediation of traditional desktop interfaces, driven by a mouse and keyboard, and the user interface paradigms associated with them, and instead delivering an experience driven by context and fitting around the user.
It is with a similar philosophy that we have designed and built our Future Meeting Room. Core to the design process has been the belief that surfaces should be computationally capable, by which I mean they should be capable of receiving and outputting digital information. Furthermore, we aim to decouple function and content from discrete objects (screens, laptops, mobile devices) and instead consider those objects as portals to function and content, enabling and facilitating real-time, concurrent remote collaboration.
Removing the mediation of the traditional desktop-computing paradigm frees developers and designers to work towards the next generation of user interfaces. The space is intended to be a blank slate where the technology doesn't drive the applications, but the needs, wants and activities of the people using the space do.
We feel collaborative environments must allow the free flow of ideas between people while maintaining a browsable history and formalisation of any process that takes place. Constraining people to one-at-a-time interactions in a crippled space has no place in the 21st century. Be it a door, a wall or a tabletop - all aspects of an environment should help the people within it fulfil their activities and do so in pleasurable, intuitive ways.
The Future Meeting Room was opened for use at the end of January 2010 and promptly received a name change. To call a room 'Future' seemed rather odd and so the space was christened the Interactive Collaborative Environment (ICE). The name is intended to reflect the activity that the space facilitates, namely interactive collaboration. As mentioned earlier, all the surfaces in the room are computationally capable but deploy that capability in different ways as dictated by people's expectations, physical practicalities and contexts of use. Conceptually, the room's interactive surfaces can be thought of as six very large iPhones.
Consider traditional interfaces: they have a top and bottom and consequently a left and right, i.e. they have an orientation. This makes sense as much of the content displayed on any interface also has an orientation, for example, a text document. This is fine when everyone who is looking at the interface is in the same orientation, as is the case for wall-mounted displays, but falls down when viewed from multiple orientations, for example, from around a tabletop. Suddenly, one user's up is another's down. This means completely rethinking the menuing and windowing paradigms of traditional, orientation-fixed interfaces. For example, to open the application launcher on the ICE-desk, a user draws a capital 'N' gesture, from which the system can determine his orientation; it can then open the launcher so that it is correctly orientated to the gesture and thus the user. Someone making the same gesture on the other side of the table would generate a launcher which opens at an orientation 180 degrees to the initial user, but is correctly orientated to themselves.
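The orientation inference the 'N' gesture provides can be sketched in a few lines. This is an illustrative reconstruction, not the ICE's actual code: it assumes the launcher only ever opens facing one of the four table edges, and infers the user's 'up' from the direction of the gesture's first (upward) stroke.

```python
import math

def launcher_rotation(stroke):
    """Infer a user's orientation from the first (upward) segment of an
    'N' gesture and return the rotation, in degrees, needed to orient a
    launcher towards that user. `stroke` is a list of (x, y) touch points
    for the first stroke, in screen coordinates (y grows downwards)."""
    (x0, y0), (x1, y1) = stroke[0], stroke[-1]
    # Angle of the drawn segment relative to screen-"up", i.e. (0, -1).
    angle = math.degrees(math.atan2(x1 - x0, -(y1 - y0)))
    # Snap to the nearest table edge (0, 90, 180 or 270 degrees).
    return round(angle / 90) % 4 * 90

# A user on the far side of the table draws their upstroke "downwards"
# in screen space, so the launcher must be rotated by 180 degrees:
print(launcher_rotation([(100, 100), (102, 300)]))  # 180
```

A near-side user's upstroke runs towards screen-top, yielding a rotation of 0; the same gesture from either long edge yields 90 or 270.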
Once created, any artefact on the ICE-desk can be freely rotated and scaled, but throughout any activity on the table - be it mind-mapping, outlining a proposal, examining the blueprints and CAD models of a building or browsing through a design portfolio - the requirements of a non-orientated interface must always be considered.
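The free rotation and scaling of artefacts follows from standard two-finger transform arithmetic rather than anything specific to the ICE; as a sketch (the function name and conventions here are ours), the vector between two touch points before and after a drag encodes both the scale factor and the rotation:

```python
import math, cmath

def two_finger_transform(p0, p1, q0, q1):
    """Return the (scale, rotation-in-degrees) implied by two touches
    moving from positions (p0, p1) to (q0, q1) - the usual way a
    tabletop artefact is rotated and scaled under two fingers."""
    before = complex(*p1) - complex(*p0)
    after = complex(*q1) - complex(*q0)
    ratio = after / before  # complex division encodes scale and angle at once
    return abs(ratio), math.degrees(cmath.phase(ratio))

# Two fingers 100px apart end up 200px apart and turned a quarter-turn:
scale, angle = two_finger_transform((0, 0), (100, 0), (0, 0), (0, 200))
print(round(scale, 6), round(angle, 6))  # 2.0 90.0
```

Because screen coordinates grow downwards, a positive angle here corresponds to a clockwise turn as the user sees it; an orientation-free interface applies the transform about the midpoint of the two touches.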
A further challenge will be the development of universal gestures that work across devices, operating systems and, more critically, the cultural differences of global users. Ingrained western metaphors, such as left-to-right meaning start-to-finish, are based on cultural artefacts like western languages; they have a deep impact on intuition and affordance, and are not necessarily globally accepted.
Conceptually and architecturally, the wall screens and ICE-desk are not considered computers or even computer screens, rather they are six zones of interaction which form interchangeable interactive windows onto digital data. This is an important separation as it means users can break free from the notion of local data or even local interactions and instead consider the ICE as a particularly well-endowed facility for collaboration. To participate remotely, all anyone needs is an Internet connection, a Web browser and, ideally, a webcam. They can then enter a URL and connect to the ICE, with zero downloads required, and participate in full video conferencing as well as screen sharing and collaboration. Six remote sources can simultaneously video conference with one another and the ICE.
We think the financial and environmental savings such remote collaboration can bring to any organisation will be the driving force behind the adoption of these technologies - or, indeed, the occasions when remote collaboration is the only alternative, a recent example being the volcanic ash flight ban. This is an example of the concept of effortless computing the ICE is trying to foster: using open standards and remaining hardware- and software-agnostic means the technology fits around the users rather than being forced upon them.
Replicability and effortlessness
We hope the ICE will be several things. First and foremost, it is a functioning real-world meeting room, available to faculty members and students and a hireable resource for industry. Secondly, it is a platform for interdisciplinary research across the university, exploring network security, embedded computing, environmental monitoring, information visualisation and user interface design. Lastly, and perhaps most importantly as a mission statement for the future, it is a demonstration that computing should be separated from a box and instead considered an experience; that digital technology should augment our activities, not supersede them; and that it is what people do that is important, not the technology that helps them do it.
Inside the Future Meeting Room - The Desk
The room's centrepiece and hub is the bespoke ICE-desk, a 104in n-point multi-touch-capable device. It currently handles 80 simultaneous touches - eight people using every finger of both hands - with zero slow-down. What multi-touch offers is a remarkably natural way for people to interact with digital artefacts.
The ICE-desk is, in essence, a light-tight box flooded with infrared (IR) light. The IR light is shone at the interactive surface, which has a clear acrylic top with a diffusing layer above. When an object touches this, it reflects more light than the diffuser or objects in the background. These blobs of light are detected by four IR cameras, and their X-Y coordinates are tracked over time and described as Tangible User Interface Objects (TUIO).
The TUIO protocol allows the transmission of an abstract description of interactive surfaces, including touch events and tangible object states. This protocol encodes control data from a tracker application and sends it to any client application capable of decoding the protocol.
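As an illustration of what a client on the receiving end of that protocol decodes, here is a minimal, hypothetical state tracker for the cursor (2Dcur) profile of TUIO 1.1. A real client would receive these messages as OSC bundles over UDP; the transport layer is omitted here, and the class name is our own.

```python
class TuioCursorTracker:
    """Minimal sketch of a TUIO 1.1 client's state handling for the
    /tuio/2Dcur (cursor) profile. Each frame carries 'alive' (the live
    session ids), 'set' (per-cursor state) and 'fseq' (frame sequence)
    messages; here we track only cursor positions."""

    def __init__(self):
        self.cursors = {}  # session id -> (x, y), normalised to [0, 1]

    def handle(self, *args):
        cmd = args[0]
        if cmd == "alive":
            # Drop cursors whose ids no longer appear in the alive list.
            live = set(args[1:])
            self.cursors = {s: p for s, p in self.cursors.items() if s in live}
        elif cmd == "set":
            sid, x, y = args[1], args[2], args[3]  # remaining args (velocity,
            self.cursors[sid] = (x, y)             # acceleration) are ignored

tracker = TuioCursorTracker()
tracker.handle("alive", 12)
tracker.handle("set", 12, 0.25, 0.5, 0.0, 0.0, 0.0)
tracker.handle("alive")    # finger lifted: id 12 no longer alive
print(tracker.cursors)     # {}
```

Because state is redundantly refreshed every frame, a client that joins late or drops a packet converges on the correct cursor set as soon as the next 'alive' message arrives.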
There are five wall screens of two types, each capable of multi-touch interaction. On the central wall is the 46in multi-touch cell, which creates the image using a high-definition (HD) LCD rather than rear-projection. The other four screens are 42in HD LCDs which use an IR overlay to generate the multi-touch data: IR LEDs create a thin skin of IR light across the surface, and any point of contact on the screen can then be located by cameras in the corners of the overlay.
Each screen is driven by its own computer. The wall screens boot into Windows 7 as this is currently the only operating system to use multi-touch gestures, such as pinch to scale and two fingers to rotate, across any system application.