Thursday 25 October 2012

Web Design Issues

Understandability
Recently, a study was carried out on how users read webpages, and the results showed that, put simply, they don't. Rather than reading a webpage, the average user (around 80% of people) scans it the first time they visit. When someone arrives at a webpage for the first time, it is imperative that the designer has made a good first impression, whoever the audience may be. This is why a webpage, when first viewed, must be easy to understand and read, and should let the user predict how the content will behave or appear next (consistency).

This is why web designers must employ scannable text, with features such as highlighted keywords to give a broad sense of the content, and one subject per paragraph so the body stays varied rather than 'waffled'. Furthermore, users don't like reading whole walls of text, so keeping the main points clearly separated and adding variety with lists is always better than letting readers lose their place in too much content.

To the right is a key example of a poorly designed website that makes a poor first impression in both understandability and functionality. Please read the annotations.

Consistency
Earlier it was mentioned that users find content easy to understand when they can predict how it will behave or appear next; this is what we mean by consistency. Consistency is a very powerful tool, and it is most effective when behaviour users know from previous experience and new features are balanced in moderation. When users spend a lot of time on a website they grow accustomed to its layout, and they hate dramatic changes. Over time a frequent visitor forms expectations about how the site should act, based on their experience there and on other websites too. If the site deviates from those expectations, such as with a completely new interface or a dominating new feature, users may leave altogether because it has become harder to use.

Many popular websites update their interface frequently, either to improve performance in the long term or to refresh the look of the business for new audiences. Take Facebook or YouTube: every day millions of people use these sites for social media and connection, and every six months or so the layout changes. Just a week ago YouTube changed the layout of its site, but is it for the best? Sure, users will grumble at first about their favourite site being unrecognisable, but will the new features (such as Google/YouTube integration) be more beneficial in the long run? Should YouTube be more open about beta testing its layouts, rather than flinging new designs onto users and expecting them to go along with it, especially when thousands of people (partners) depend on the site for their livelihood?

Above is a short video by Vincent Flanders showing a website with inconsistent navigation in terms of font colour and text decoration, and a lack of content within its webpages.
 

Cross Browser Compliance: Vendor Prefixes
As CSS develops, browsers differ in how quickly they support its more sophisticated presentation features, such as CSS3's box-shadow, and developers face the issue of multiple vendor prefixes. Vendor prefixes are short tags that go in front of newer CSS property names; in some cases they are needed, otherwise the feature may not display correctly on a website in that browser. The key issue is that, for the declaration to work, the CSS must cater for every browser that might render the code, so the same declaration has to be typed out repeatedly, each copy with a different prefix such as -moz- (Mozilla Firefox) or -webkit- (Chrome and Safari). Most coders, however, don't want to write the same property and value over and over again; it is time-consuming, and it would be far more efficient to collapse all the prefixes into one single prefix such as -beta-.
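
As a rough sketch of what this repetition looks like in practice (the .card selector and the shadow values below are placeholders, and the exact set of prefixes needed depends on which browser versions a site chooses to support), a single box-shadow ends up declared three times:

.card {
  -webkit-box-shadow: 0 2px 6px rgba(0, 0, 0, 0.4); /* Chrome, Safari */
  -moz-box-shadow: 0 2px 6px rgba(0, 0, 0, 0.4);    /* Firefox */
  box-shadow: 0 2px 6px rgba(0, 0, 0, 0.4);          /* standard, unprefixed form listed last */
}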

In broader terms, though, we can simply say that there are too many differences between browsers in how they render CSS. Take the example of the box-shadow property: Mozilla's and WebKit's browsers render it differently from one another, as each handles the shadow blurring in its own way. Perhaps, instead of thousands and thousands of developers writing repetitive code, the implementations could be brought closer together so that the browser companies achieve interoperability. If they don't, it may get to the point where certain websites suggest which browser they look best in, or, more dramatically, where the CSS decides which browser gets a certain feature and which doesn't.
If those sorts of prospects took hold in today's market for designing websites, whole layouts could look messy and fall apart in one browser compared to another. Striving towards unified browser behaviour, or another method such as a single -beta- prefix, might quieten some of the grumbling from both browser makers and web developers.
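
Purely as an illustration of that single-prefix idea (no browser actually supports a -beta- prefix, so the first line below is hypothetical syntax rather than working CSS), the repeated declarations above could in principle collapse to:

.card {
  -beta-box-shadow: 0 2px 6px rgba(0, 0, 0, 0.4); /* hypothetical single prefix covering all engines */
  box-shadow: 0 2px 6px rgba(0, 0, 0, 0.4);       /* standard form once the feature is finalised */
}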

The Search Function
Search is another feature that is fundamental to nearly any website with a user database or with a lot of information that may seem irrelevant to some visitors. Whether it's shopping for food or checking on a user you used to socialise with on a forum, if your website doesn't have a broad-scope search with easy-to-use functions, it will suffer. Typical users are quite poor at reformulating queries: if they don't get good results on the first or second try, reaching for advanced features or trying other words seems too much effort for the mass of web surfers. Especially as most queries have a success rate of only around 50% (on most e-commerce sites), you want your search function to be neither too specific nor too loose.

If a website's search function were overly literal, then typos, plurals, hyphens and the like would not be treated as relevant to what you're searching for. More specifically, if I searched for holidays in Hawaii but kept spelling it "Hawaai", I wouldn't receive any results that might be useful to me. In terms of audience, overly literal search is also bad for elderly people and young children, as they're more prone to making mistakes. Another mark of a poor search function is an engine that prioritises results purely on how many query terms they contain, rather than on the relevance of the whole document or webpage.

Most of the time it's best for a website to go for a simple search, as a small box is what the majority of users expect from the interface, especially since search serves as an alternative when the navigation fails.


