Not that long ago Gizmodo posted an article about one of its journalists attempting to last a week using a Dell Venue 8 (an 8" Windows 8 tablet) as their primary device. Success? Sort of...
But the possibility of using your tablet as your main computing device is becoming more of a reality with a scattering of high end tablets, wireless connectivity options and the ubiquitous OTG (lets you use your micro USB charging port with any standard USB device).
Admittedly some of these devices are rather expensive when compared to a similarly specified laptop/notebook/ultrabook (the higher end ones such as the Lenovo Yoga Pro 3), but others can be had at a much cheaper price point (the Teclast X series). Some even offer dual boot capabilities in Windows 8 and Android.
Armed with a tablet with Bluetooth, WiFi and WiDi (Wireless Display) you don't even need any cables to set yourself up with a tablet centred desktop PC (masking tape maybe, to stick it to the back of your monitor?).
Combine it with a Bluetooth keyboard case/cover and you've got not only a touch tablet but a fully functional laptop to go.
abilIT
Sunday, 19 July 2015
Friday, 13 February 2015
Time to go Tactile
It's no secret that I've often lamented the loss of tactile interaction caused by the use and continuing popularity of touch screens. Touch screens have of course proven useful and particularly versatile, as they can modify your interaction technique depending on what's displayed. But this versatility came with a trade-off: there was no way to physically tell the user that a button had been pressed or a knob had been turned, common occurrences when we interact with the day-to-day environment of our world.
This challenge has long been a "dream goal" of many hardware and software developers and, generally speaking, the domain of science fiction. But not anymore: the ability to actively deform a surface has arrived and is no longer just in development (it has been actively pursued on many fronts since some of the first touch screens became commercially available).
Tactus Technology has now introduced its new Phorm cover for the iPad mini. Granted, it's not the end product or even an integrated solution, but it's a very promising start. No longer will you need to fumble around and actively seek keys with your vision (a common issue with most current touch screen interfaces, which often require a "search and peck" approach to typing).
Monday, 9 June 2014
Haptics
HAPTICS
Technology and Touch
Overview
- What is Haptics?
- Haptic Communication
- Harnessing Haptics
- Technology and Design
- Where's the Button?
- Haptic Revolution
- Touching the Future
What is Haptics?
While Haptics has been studied for some time, with some of the earliest recorded research occurring in the early 1900s, it is a relatively new topic area that is highly multidisciplinary. It is described by Wikipedia as "any form of non-verbal communication involving the sense of touch" and by Dictionary.com as "to touch". Haptics forms part of our everyday life and is part of current multi-modal concepts of communication, which also include aural and visual cues. It is particularly common in environments where two or more people share the same physical space. The way in which we communicate via touch is often reliant on culture. For example, a traditional Western style greeting is the handshake (particularly between males). Job interviewers still rely heavily on this basic contact to determine overall factors about a job candidate, such as confidence.
Haptic Communication
The Deafblind Community has relied heavily on Haptics as part of its formal communication strategy. While Deafblindness varies greatly in severity, and thus communication methods have high levels of variation, there are many formal methods that are almost entirely based on interpersonal touch (touch occurring between two individuals). Touch is also used by Deafblind individuals to explore their environment and to receive either direct or indirect transfer of other communication modalities (eg speech -> tactile). This heavy daily reliance on Haptics makes the community an expert in an emerging field that still has a long way to go in fully understanding the extent of Haptic capabilities.
Some examples of formal communication methods occurring within the Deafblind Community include Deafblind Tactile Fingerspelling (Australia and UK), Tactile Auslan or Hand Over Hand (Australia), Braille (internationally), LORM (Germany) and Social Haptics (Europe). But while these formal methodologies exist, it is also important to keep in mind that Haptic communication is part of our everyday life and occurs regardless of formal methods, age, culture and other influencing factors.
Harnessing Haptics
The human body is hard wired for Haptic reception. But this wiring is not evenly distributed, with Haptic resolution varying greatly between surface areas of the body. While the back is relatively low resolution, the hands, and particularly the fingertips, are extremely high. Ever wonder why babies constantly place things in their mouth? The human mouth is one of the most sensitive locations, but not to worry, we're not about to start putting electrodes in there... There are two important things to remember about the type of touch our body is primarily geared to receive: rate of change (ie movement) and differentiation, much in the same way our eyesight works (we are particularly sensitive to motion and large differences/high contrast).
So what can we feel? There are lots of different things we can feel/sense through our touch receptors:
Temperature - Hot or cold?
Moisture - Dry or wet?
Sound/Vibration - What is the amplitude and frequency?
Shape - Is it rounded, flat or pointy?
Size - How large is the object?
Orientation - Which way is the object facing?
Weight - Is it light or heavy?
Tension - Is it loose or highly strung?
Some interesting factors to note: if we place two temperature based inputs close enough together, our skin averages out the temperature and does not treat them as discrete inputs (eg hot + cold equals warm). Over the years there have been devices and techniques that have attempted to directly transfer speech into touch (Tadoma, Vocoder, Tactaid), but these have had little in the way of mainstream uptake. One possible reason is that the important frequencies at which speech occurs are often outside the range of frequencies the human skin is attuned to (generally thought to be 30 - 300 Hz).
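To make the frequency mismatch concrete, here's a minimal sketch. The 30 - 300 Hz tactile band is the figure quoted above; the speech frequency values are rough textbook estimates I've assumed for illustration, not measurements.

```python
# Toy model: which parts of speech fall inside the skin's vibration range?
# TACTILE_BAND comes from the post (30 - 300 Hz); the speech component
# frequencies below are assumed, typical textbook values.

TACTILE_BAND = (30, 300)  # Hz, approximate range skin vibration receptors respond to

def skin_can_feel(freq_hz, band=TACTILE_BAND):
    """Return True if a vibration frequency falls within the tactile band."""
    lo, hi = band
    return lo <= freq_hz <= hi

# Rough, assumed values for common speech components:
speech_components = {
    "male fundamental (~120 Hz)": 120,
    "female fundamental (~210 Hz)": 210,
    "first formant (~700 Hz)": 700,
    "second formant (~1700 Hz)": 1700,
    "sibilance (~5000 Hz)": 5000,
}

for name, f in speech_components.items():
    print(f"{name}: {'feelable' if skin_can_feel(f) else 'outside tactile range'}")
```

Only the fundamentals survive the trip to the skin; the formants and sibilance that carry most of the intelligibility do not, which is one plausible reason the direct transfer devices struggled.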
But we are also capable of detecting and determining compound differences. Texture is a very good example of this (as is Emotional intent). Texture comprises many variables including Temperature, pattern (uniform or irregular), friction (rough or smooth), hardness and many other factors.
Technology and Design
Over the years, technology has changed greatly and the focus of the design phase has changed with it. Haptics has become an important part of the design process, but it's less about Haptics being a trade-off of technological requirements and more about actively responding to the user in a Haptic modality. Three factors of older technological design were form equals function, technological constraints and material constraints. As an example, let's take the old style rotary dial phone.
In this case, both the form equals function and technology constraints are fairly similar and intertwined. For example, we need a handset to ensure that the microphone and speaker are close to the mouth and ear. The rotary dial mechanism is required to generate the correct dial pulses. Those familiar with older style rotary phones might remember the inbuilt Haptic components of the rotary dial (it would stop when you reached the end). Materials in use at the time were timber and brass, which were commonly available and easily worked with. This had an impact on how the phone felt to use as well.
What about today's technology though? These days we rely heavily on multi-functional devices, so we've almost completely lost the rule of form equals function. Technological constraints now work somewhat in reverse, being determined more by the form factor (the predetermined size and weight constraints) of the device. Material constraints are still somewhat present but are less restrictive, with access to a greater range of easily manipulated materials.
Let's take a look at a tablet as an example of a modern technological device.
Except for processor constraints in the area of technology, neither form equals function nor technology constraints are really at play. Design has become almost entirely about "What size do we want it to be?" and then cramming in as much technology as we possibly can (eg high resolution touch screen, speakers, microphone, accelerometer, gyroscope, GPS etc). Here, materials are a more interesting aspect of design for a Haptics enthusiast. This is especially true with current ranges, where it's all become about how the user perceives the device when they touch it. Good examples of materials use are the Galaxy NotePro devices, which come with an artificial leather back designed to provide the sensation of high quality materials while providing high levels of friction for non-slip use. Apple also sticks with its aluminium Unibody design, which is known to produce a sensation of good quality.
But humans are still slaves to preconceived notions of quality in terms of Haptics. This is evident in DVD/Blu-ray player manufacturers placing weights in their machines to generate the sensation of quality components (good AV equipment was often heavier). Our perception is also linked to other inputs such as vision and our preconceived notions of how an object should behave (eg a washing machine dial that does not "click" on each setting may cause a sensory mismatch).
Where's the button?
In 2007 we saw a fundamental shift in the popularity of touch screen devices with the release of Apple's original iPhone. While not technologically spectacular (no MMS, and often criticised for very ordinary hardware components), it did bring the concept of the touch screen to the masses. Previous devices were already available (HTC, Palm etc) but hadn't enjoyed such levels of popularity. With the popularity of such a flexible device we also saw a change in how sectors such as Disability Technology responded. No longer did you need a special device for video magnification, document scanning and OCR, book reading and many other tasks. You simply needed the right App, all on the one device that fit in your pocket. But smartphones still tend to remain a bit of a "Jack of all trades" device, with problems such as no stand for extended use of a magnifying App, a camera not ideal for close focal points and a microphone not well suited to speech recognition in noisy environments (although they are improving).
The good part about the use of a touch screen device was that it offered much greater portability, and a (hopefully) simple way to access many input and output functions, because we only need to render the relevant controls on the screen at any given time (eg a play button to start music). Responsiveness has also improved along with computing capabilities, and this has allowed us to start using much more complicated functionality (eg speech -> text). All things considered, the touch screen has offered the disability community a far greater level of independence than it has previously enjoyed.
But as always, the bad comes with the good. In particular, and especially relevant to people with a vision impairment, touch screens are still relatively vision intensive (braille displays can be linked, but this detracts from portability). Sometimes there are issues with glare (in outdoor environments), and App developers vary wildly in their rendering of common functions and design strategies (there is no standard followed for software design and interface layout). Furthermore, current Operating System trends have a heavy reliance on text and are still not particularly finger-friendly.
So are touch screen devices better or worse?
It depends entirely on the application of the touch screen and how its interactivity is implemented. A good example is turning the page of your book by swiping the screen. But touch screens are not always appropriate for the use and environment at which they're aimed. Do you really want to be trying to see your car's climate control system on a touch screen in a glare-filled car while trying to drive? Not likely...
Haptic Revolution
So your phone vibrates, as do most tablets now. That's the very tip of the iceberg though, and is accomplished with a very simple DC motor placed in direct contact with your device's outer edge (often only able to vary the duration and frequency of the motor). Active Haptic feedback is becoming more and more of a possibility with large companies investigating it (it's the next frontier). Size and expense constraints are also diminishing and there are plenty of developments on the horizon. Some that relate to touch screen devices include placing active force on a naked finger, active deformation of the screen height (ie raising and lowering parts of the screen), phone bumpers with multi-actuator feedback to provide haptic GPS directions, and many others.
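The "dumb" motor described above can only be switched on and off, so a vibration pattern reduces to a list of timed on/off steps (much the same shape as the waveform arrays mobile vibration APIs accept). A minimal sketch, with made-up illustrative timings:

```python
# Sketch of a basic ERM vibration motor's expressive range: the only
# controls are "on for how long" and "off for how long". All millisecond
# values here are illustrative assumptions, not hardware figures.

def make_pattern(pulses, on_ms=100, gap_ms=50):
    """Build an on/off timing list of (state, duration_ms) pairs."""
    pattern = []
    for _ in range(pulses):
        pattern.append(("on", on_ms))
        pattern.append(("off", gap_ms))
    return pattern

def total_duration(pattern):
    """Total length of a pattern in milliseconds."""
    return sum(ms for _, ms in pattern)

notify = make_pattern(pulses=2)  # a short double-buzz
print(notify)          # [('on', 100), ('off', 50), ('on', 100), ('off', 50)]
print(total_duration(notify))  # 300
```

That flat, two-value vocabulary is exactly why everything a phone "says" through its motor feels the same; the richer actuators discussed below add amplitude and frequency to the mix.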
But let's get back to Haptic Communication.....
Several Deafblind related gloves are currently starting to reach the introduction stage. One of these is the LORM Data Glove, so let's have a look at what it can do. The LORM glove converts text messages on a mobile phone to the LORM alphabet (and vice versa), using a series of vibrotactile actuators to generate vibrations and sensors to detect active touch on the glove. It's important to note here that the LORM alphabet is generated by a series of taps and swipes on the fingers and palm of the hand.
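The text-to-tactile pipeline such a glove implements can be sketched in a few lines. Note that the letter-to-gesture table below is entirely hypothetical, for illustration only: the real LORM alphabet assigns specific taps and strokes to hand locations, and this toy mapping does not reproduce it.

```python
# Sketch of a LORM-style text -> gesture pipeline.
# TOY_LORM is a HYPOTHETICAL mini-mapping; it is NOT the real LORM alphabet.

TOY_LORM = {
    "a": ("tap", "thumb_tip"),
    "b": ("tap", "index_tip"),
    "c": ("stroke", "palm"),
    " ": ("pause", None),
}

def encode(text):
    """Translate a message into a sequence of (gesture, location) actuator commands."""
    gestures = []
    for ch in text.lower():
        if ch in TOY_LORM:
            gestures.append(TOY_LORM[ch])
        # characters outside the toy table are silently skipped in this sketch
    return gestures

print(encode("ab c"))
# [('tap', 'thumb_tip'), ('tap', 'index_tip'), ('pause', None), ('stroke', 'palm')]
```

Each gesture would then be dispatched to the matching vibrotactile actuator on the glove; the reverse direction (glove sensors back to text) is the same table read the other way.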
This is a great step in the right direction, but there are limitations and possible problems. One key limitation is that the actuators used, while small, are only either on or off, which produces a very "flat" communication method without emotional content. It's also very directed and single purpose, and it's in contact with the hand constantly, meaning it may suffer from desensitisation over time (not to mention wearing out of materials through continual removal and contact).
Touching the Future
Which is where technological advancements such as the Haptuator Mark II (TactileLabs) come into play. Relatively cheap, small and easy to control, the Haptuator Mark II has greater flexibility. It's not just a matter of being on or off, but of controlling the actuator's amplitude, frequency and duration with a quick response time. With actuators along these lines we can generate different types of "contact", and not just the general tap/swipe that we are used to. Some sensations we can achieve are the tap (high frequency, high amplitude and short duration), rotation (propeller-like noise) and directional blunt motion (white noise; a quick swipe can be generated by simply activating two actuators that are fairly far apart in quick succession).
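The "recipes" above compose naturally once each actuator exposes amplitude, frequency and duration. A minimal sketch follows; the numeric parameter values are illustrative guesses, not Haptuator Mark II specifications.

```python
# Sketch: composing tactile sensations from parameterised actuator pulses.
# All frequency/amplitude/timing values are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Pulse:
    actuator: int      # index of the actuator in the array
    freq_hz: float     # drive frequency
    amplitude: float   # 0.0 - 1.0
    start_ms: int      # when the pulse begins
    duration_ms: int   # how long it lasts

def tap(actuator, at_ms=0):
    """The 'tap' recipe: high frequency, high amplitude, short duration."""
    return [Pulse(actuator, freq_hz=250, amplitude=1.0,
                  start_ms=at_ms, duration_ms=30)]

def directional_swipe(from_actuator, to_actuator, at_ms=0):
    """Two spatially separated actuators fired in quick succession read as motion."""
    return [
        Pulse(from_actuator, freq_hz=80, amplitude=0.6,
              start_ms=at_ms, duration_ms=120),
        Pulse(to_actuator, freq_hz=80, amplitude=0.6,
              start_ms=at_ms + 60, duration_ms=120),  # overlapping onset implies direction
    ]

# A short timeline: a tap, then a swipe from actuator 1 towards actuator 3.
timeline = tap(0) + directional_swipe(1, 3, at_ms=200)
for pulse in timeline:
    print(pulse)
```

Because each sensation is just a list of pulses, they can be sequenced and layered, which is what makes the richer communication discussed next thinkable.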
This allows us to think about much more complicated and intricate forms of communication. We can even look at transferring emotions, music and visual content.
Soon the Deafblind community could be making mobile calls using Haptics technology.
Sunday, 23 March 2014
Mobile Human Computer Interfaces
In terms of Human Computer Interfaces (HCI) we've come to expect a fairly standard set of options, or in other words, the status quo. But do they really meet our needs, and is the concept of the smartphone operating system really being appropriately considered from a finger friendly perspective? Not really, at least not in most cases.
I am sure some punters will laugh in my face, but despite Windows Phone 8 essentially being the Ugly Duckling of the current crowd, it is one of the better implementations of a User Interface that is specifically designed for use with your fingers. Its generally large tiles (although you can usually select from several sizes) allow for easy pressing of desired applications.
Ok, I hear you cry out "But what about iOS???!!" and the other crowd who undoubtedly will argue that Android is much better....
It's undeniable that both operating systems are now a pretty sight, and Android in particular allows for a large amount of flexibility in changing the appearance to suit your needs. But heavy use of text and poor in-App design are often a problem. Items such as the back button in the new iOS 7 Safari leave a lot to be desired, and Android doesn't do much better, with heavy use of text in its interface that is often difficult to identify as buttons unless you have previous experience (the Save and Cancel buttons at the top right common to most apps are terrible).
Some basic principles of design could really help improve the useability of our finger oriented devices. Steve Jobs is famously quoted as saying Apple would not produce a smaller tablet unless you could shrink your fingers. Well, our fingers haven't shrunk, but the Operating Systems haven't provided us with bigger buttons to press either, regardless of screen real estate.
Friday, 7 March 2014
Apples and Lemons
I was recently helping a friend at an Apple store. The problem? An iPad mini that reports the headphones are always plugged in, even when they're not, and won't play sound through the speaker. This had been an ongoing issue for 6 months, which I attempted to fix by updating the iPad to the newer iOS 7...
The problem itself started after a set of headphones was plugged into the headphone jack on the iPad and removed an hour later; the issue has occurred ever since. The headphones were purchased through the Apple store, were plugged in by a sighted person, used for that period of time by a child to listen to music, and then removed again by the adult. We finally found out that the issue is that part of the headphone plug has snapped off in the socket.
And you guessed it...
That VOIDS the warranty. So it would be $250 to repair (outrageous).
The basic story from Apple is that this issue was caused by a third party product (even though Apple sold that product) and therefore it is not their responsibility. They will not even follow up with the manufacturer of the headphones to determine whether there is a known issue with the headphone plugs being faulty.
This makes me wonder how Blind people feel when they are required to plug cords, cables and all other manner of items into their i-devices? Especially when a lot of people in the Disability sector are using non-standard equipment, or at least equipment that Apple does not even remotely endorse. I guess we all must line up now to get someone "Officially Endorsed" by Apple to plug our equipment in, just in case.....
Saturday, 28 September 2013
Mobile Screen Magnifier Round Up
Android Screen Magnifier
As the newcomer to this particular functionality (at least to my knowledge), Android has some expectations to live up to and a fairly high bar to match. There are some key and interesting differences between this implementation and those present on Windows and iOS based devices, even though at first glance it offers the same feature set.
Unlike iOS and Windows Phone, the Android based magnifier does not zoom in on the keyboard. What? Yes, that's right, it doesn't zoom in on the on-screen keyboard. This may sound odd at first, but if you've ever attempted to type while the screen is magnified you can probably recognise that this feature comes in handy. It works by keeping the screen itself magnified (in particular the text you are entering) while slotting the standard keyboard in at the bottom of the screen.
The other odd feature is that when you get a notification in a dialog box, the screen automatically disengages the magnifier. This might seem unusual to start with as well, but it ends up working favourably, as it stops you from remaining oblivious when there is a change of focus in the application, which can sometimes occur in iOS (where you might be zoomed in somewhere and not aware of an item requiring your attention elsewhere on the screen).
Apart from this, all functionality is essentially identical to those in Windows Phone and iOS based devices.
iOS Zoom
iOS Zoom has been around since the first iPad and offers the same functionality as that available on Windows Phone and Android based devices. Its key difference is that it utilises three fingers for its gestures. This can be both good and bad: it's distinct and you won't be setting it off accidentally, but it can sometimes be hard to pull off without accidentally clicking on an icon or something you didn't intend to.
Perhaps the most useful distinction here is that the iOS Zoom still retains its kinetic scrolling and flick functionality. This is handy for reading long lines of text or if you want to quickly go from one side of the screen to the other without zooming out.
WP8 Screen Magnifier
There's nothing really here to set it apart. About the only thing I can mention is that the combination of the layout design, the capability to enlarge font sizes in all aspects of the OS, and the implementation of a basic but well executed magnifier works well.
So which one's for me?
In my view, Android has the edge on the other mobile OS' here. The way it implements its magnifier is a much more readily useable system that ensures you stay aware of your full screen content while allowing you to zoom in on what's important.
Wednesday, 25 September 2013
iOS7 : Designed to Brill
Apple, iOS and its iDevice range have long been renowned for their ease of use and in particular, the finger friendly approach of their User Interface design. Competing mobile Operating Systems have long been criticised for a lack of optimisation on devices that are primarily for use with your fingers.
Enter iOS7.....
Apart from its tendency to make a bright first impression (often with an ongoing after-image!), it is more concerning to note some of the subtler changes in the User Interface that reduce its useability, especially from the perspective of a vision impaired user.
Essentially Apple has dumped its colour coded button system, perhaps most prominent within its own Apps and the Settings application. Gone are the red Cancel buttons and the blue Next/Done buttons, replaced by small text links that are often hard to locate. This trend continues in Mail with a white on white on white approach.
Granted, while it looks a little more refined (although some claim it looks "childish"), it's an odd step for Apple to take.
Unfortunately as someone with a Vision Impairment I cannot seriously consider recommending an iPad/iPhone anymore. Which is a shame, as Apple has long since been a leader in the accessibility arena.