Natural interactions for medical image analysis

Interaction-Design Master Thesis

Posts Tagged ‘interface’

Nexight (Final Design)

with 2 comments

I created a page that explains the final design of this project. There you can find all the final videos and explanations in one place. I called the system I’ve designed Nexight:

Nexight (Final Design, descriptions, resources, links and videos)

There is also the final animation of the user interface (you should check out the high-definition version to see all the details of the interface).


Written by Jannes

May 31, 2009 at 17:17

Touch Concept Video Raw

leave a comment »

Here is the first raw version of the full touchscreen concept for this project. I rendered it last night, and there are still some quirks, missing screens, and no explanation at all. But it was really important for me to see the whole thing together. It made me realize that there are still small problems and inconsistencies in the interface. I’m not sure if I will be able to change everything, because there is only a little time left. Still, I think it conveys the concept and idea very well, and that’s what matters most.

It’s pretty hard to see the details of the interface in the low-resolution version here on the web. I guess I will have to make an extra web version in the end. Also, Vimeo somehow stretched the video image. (…a lot of things to fix)

Written by Jannes

May 11, 2009 at 12:41

Graphical User Interface

leave a comment »

All my scenarios and prototypes until now have had no common graphic design at all. I didn’t want to spend too much time on the graphical user interface (for now). But I think it’s time to give the project a nice visual touch.

I think VJing interfaces can be a great source of inspiration, because they are specialized for working under dark (and difficult) lighting conditions. It’s an environment similar to the radiologist’s reading room, in which the light is dimmed to enable better viewing of the images. And in both VJing and radiology, the visual output is what matters most.

Lemur is a multitouch VJing tool with a very bright and clean graphical interface.

Another source of inspiration is data visualization and aesthetics. Here the aim is to visualize data in an attractive way and, more importantly, to make the meaning behind the data easily understandable. If you think about radiology this way, it’s less about viewing X-ray images and more about making the (body) data understandable to radiologists, physicians, surgeons, patients…

information aesthetics and visual complexity are great collections of data visualizations (and two of my all-time favourite blogs)

An interesting trend I see in multitouch interfaces is the lack of a graphical user interface at all. If the interaction with the content is natural and easy enough to understand, we might not need buttons, frames, sliders, icons, tools, mouse pointers, etc.

I want to create a clean interface that doesn’t distract from the images/data, but still enables easy interaction. The interface should also support the reading of the data and the understanding of relationships between the different sources of data.


Written by Jannes

April 10, 2009 at 08:46

Posted in Phase 04 - Design


physical cube to view/manipulate a 3D model – v0.2

leave a comment »

In my last meeting with my mentor Mine we discussed the idea of the physical-cube interface. Mine had a good point in mentioning the difference between looking at a 3D model on your own and the situation in a video conference. I think the augmented reality approach (in which you see the actual video stream, with your own hands and the cube, augmented with the digital 3D model) is great for discussing a case online in a video conference, but doesn’t make much sense while analyzing a case on your own. If you are on your own, you don’t really need to see your own hands on the screen.

So I think there should be two concepts – one for working alone and another for discussing and showing a case online.

In the second version of my physical-cube-interface prototype you cannot see the video stream. I tried to make the tracking more stable and also implemented some of my initial ideas for viewing actual radiology data. This is still a very early prototype, but I have the feeling that I am on the right track here. While playing around with the prototype I became more familiar with its advantages and problems. It already feels really natural to view a 3D model this way. But I think the other ideas I’ve implemented (pressure sensor, moving the cube left/right to change transparency) need some more consideration. At this stage there are too many things happening at the same time.
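The mappings in the prototype can be sketched in a few lines. This is a minimal, hypothetical illustration of the idea only – the function names and the lateral-position range are my assumptions, not the actual prototype code: the cube’s tracked orientation is copied 1:1 to the 3D model, and its left/right position is normalised into an opacity value.

```python
# Hypothetical sketch of the cube-to-model mapping described above.
# The tracker interface and the x-position range are assumptions.

def clamp(value, lo, hi):
    """Keep a value inside [lo, hi]."""
    return max(lo, min(hi, value))

def map_cube_to_model(cube_rotation, cube_x, x_min=-0.3, x_max=0.3):
    """Map a tracked cube pose to model parameters.

    cube_rotation: the cube's orientation from the tracker (e.g. a
                   rotation matrix or quaternion), applied 1:1 so the
                   model mirrors the physical cube.
    cube_x:        lateral cube position; moving the cube left/right
                   fades the model's transparency.
    """
    # The model simply copies the cube's orientation.
    model_rotation = cube_rotation

    # Normalise the lateral position to [0, 1] and use it as opacity:
    # fully transparent at x_min, fully opaque at x_max.
    opacity = (clamp(cube_x, x_min, x_max) - x_min) / (x_max - x_min)
    return model_rotation, opacity
```

With the cube centred (`cube_x = 0.0`) the model is shown at half opacity; pushing the cube past either end of the range clamps the opacity to 0 or 1, which avoids jumps from tracking noise at the edges.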

Written by Jannes

March 10, 2009 at 10:41

Tactile “Displays”?

leave a comment »

What does a radiologist do to locate and identify a tumor? He looks at images and compares them, searching for known patterns or shapes that are related to the tumor. But there are many more attributes of a tumor that could identify it. In mammography, the first step is to touch the breast and get tactile feedback about lumps. What if there were a sophisticated “tactile display”? Then we could actually feel tissue with our hands to look for patterns/shapes. Hands are super sensitive; think about how easy it is to feel a grain of sand or a hair. Such a display could give us additional feedback about softness/hardness, texture, surface, shape, composition…

This got me thinking about other features that are not considered in radiology yet:
color, temperature and smell of tissue.


picture resources:
Foam Display
NASA – Touch the Invisible Sky

further reading:
Wikipedia – Braille Display (display for the blind)

Written by Jannes

March 6, 2009 at 11:15

physical cube to view/manipulate a 3D model

leave a comment »

I think 3D models of body scans could be the future, or at least play a bigger role than they do today. Introducing a natural interface for viewing these models could help their acceptance among radiology professionals. Here is another rough sketch around this idea. It’s an actual working prototype… unfortunately it’s very flickery, but I think it still shows the basic idea of the concept: manipulating a physical object and having a digital model follow in exactly the same way.

Written by Jannes

March 4, 2009 at 22:22

Microsoft Surface for medical applications

leave a comment »

I found two medical projects for the Microsoft Surface table.

This one was shown at CES 2009. It’s a tool for medical education. It seems like a lot of revolutionary projects in the medical field are done for education (see my post about HeartWorks and the VisualBody). I guess it’s because real medical applications have to follow a lot of restrictions and need to be carefully tested.

Like HeartWorks, this project also utilizes tangible interactive objects. In this case it’s a physical model of a brain that can be placed on the table to enrich the brain model with digital data. Of course they use multitouch, all kinds of online communication, and information from the cloud. They also show a flexible display. That’s cool, but doesn’t really make sense.

VitraView by InterKnowlogy is the second medical project for Microsoft Surface I’ve found.

Read the rest of this entry »

Written by Jannes

January 30, 2009 at 16:41