I think body language might be more important, and it seems to me it should also be harder to detect. Eyes are fairly easy to detect because of the contrast between the whites and the iris, and, of course, because they reflect light.
Detecting the difference between a hustle, a saunter, and a day dream might be a tad more difficult.
Gaze is actually more important when you're trying to figure out whether someone sees you. If I can't see your eyes, I'll assume you don't see me at all.
Something like this could be developed. For instance, the car could have an array of lights on its grille (think Knight Rider) that track in the direction of detected obstacles; if no light is facing toward you, you know you haven't been detected and should use caution. Alternately, a light on the front of the car could turn from yellow to blue to indicate its intent to yield.
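Just to make the idea concrete, here's a minimal sketch of how that grille-light logic might work: each detected obstacle lights the LED whose bearing is closest, and a separate indicator flips from yellow to blue when the car decides to yield. All names, the LED count, and the field-of-view value are invented for illustration.

```python
# Hypothetical sketch of the grille-light idea. An array of LEDs spans the
# car's forward field of view; each detected obstacle lights the LED whose
# bearing is closest, and an intent indicator switches from yellow to blue
# when the car yields. Parameters here are made up, not from any real system.

NUM_LEDS = 8
FIELD_OF_VIEW = 120.0  # degrees, centered on straight ahead

def led_for_bearing(bearing_deg: float) -> int:
    """Map an obstacle bearing (-60..+60 degrees) to an LED index (0..7)."""
    half = FIELD_OF_VIEW / 2
    clamped = max(-half, min(half, bearing_deg))
    # Normalize the bearing to 0..1 across the field of view, then bucket it.
    frac = (clamped + half) / FIELD_OF_VIEW
    return min(NUM_LEDS - 1, int(frac * NUM_LEDS))

def grille_state(obstacle_bearings, yielding: bool):
    """Return which LEDs are lit plus the intent-indicator color."""
    lit = sorted({led_for_bearing(b) for b in obstacle_bearings})
    return {"lit_leds": lit, "indicator": "blue" if yielding else "yellow"}

# Two pedestrians off to the left and one straight ahead, car yielding:
print(grille_state([-45.0, 0.0, 10.0], yielding=True))
```

A pedestrian would only need to check whether one of the lights is pointed at them, which keeps the "am I seen?" question as simple as it is with human eye contact.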
A small screen (or several) might be a better option. Mercedes apparently made a concept version of the Smart with one, and the cars launched by Drive.ai already have them.