Doctoral dissertation: Evaluation of user experience in mobile advertising

The proposed research fits into the scientific field of graphics and information technology, more specifically to the field of interactive media and mobile advertising.

TL;DR: The idea is to effectively measure the impact a user interface has on performance, user engagement and user experience. The focus is on user experience within enclosed mobile content, such as ads.

Keywords:

mobile advertising, rich content, user interfaces, user engagement, user experience, marketing


Problem description

The number of mobile devices, such as smartphones, tablets, wearables and e-readers, is on the rise, and so are their capabilities (compass, camera, gyroscope etc.), enabling richer activities and interactions.[9] As a result, we have encountered a shift in the ergonomics of software design and have been introduced to new ways in which humans interact with devices: a variety of gestures (swipe, tap, drag, pinch, shake, etc.), multi-touch input and voice input. The challenge is to design the most suitable, intuitive user interface and interaction flow for a specific task.

There is no clear methodology or set of known techniques for comparing and measuring the performance of different user interface elements used to interact with interactive content (ads, applications, etc.). Online advertising needs simple, clear and universal definitions of metrics such as user engagement, user satisfaction and holistic user experience, covering interaction with the device while performing tasks such as searching for information, social interactions (communicating, commenting, sharing etc.), filling in input forms, playing games or interacting with an ad.

User engagement, among other factors such as accessibility, performance, usability, human factors and design, has a big impact on the user experience of interacting with a system.

Measuring these is rather complex. It gets complicated at the very beginning, since the accuracy of modern measurement tools for visitor data analysis seems to differ [3]. Moreover, these tools tend to offer only raw numbers, leaving it to the owners themselves to interpret how users engaged with the content and how good their experience was.

An even bigger concern is measuring marketing attribution online [4][5]: marketers tend to advertise cross-channel and users use multiple devices, which makes it unclear whether a user's behaviour was driven by an ad and, if so, to what extent. By user behaviour we mean an action performed either online (e.g. the user subscribed or booked a test drive) or offline (purchased an item in a store); for the latter it is especially unclear whether it was influenced by an online ad, which makes the return on investment (ROI) hard to measure.[6]

The doctoral dissertation will focus on evaluating user engagement and user experience for different types of user interfaces and interactions in mobile content/ads, and their possible impact on ROI. As a byproduct, we will define guidelines for the production of more efficient content/ads.

Different research methods will be used, as described below.

Related work:

It is known that context [19] greatly impacts information consumption, and so do situation [16], location and time.[17] Stress and emotions also influence how users perceive working with a device.[18] With the HTML5 standard it is possible to access device sensors, use their capabilities [9] and gather a lot of useful information about the user, which can contribute to better, more advanced and user-specific interfaces.

Some studies have shown that ads are often automatically ignored or unnoticed (so-called banner blindness) [21] [22] [23] [45], are easily forgotten [24] and can in some cases even damage the advertised brand or the placement where they appear.[13][25]

There have been studies on optimising constrained budget spend in search advertising [26] and on measuring ad effectiveness using geo experiments [27] [28], both conducted at Google. Researchers at Yahoo, Celtra and Microsoft are testing different methods of predicting the number of ad clicks per impression (click-through rate, CTR) based on an ad's multimedia features.[29][31] This is a hard problem, given that a CTR of around 1 % is already considered reasonably high. The latter measurements are often used to compute expected revenue under different business models, such as cost per impression or cost per click.
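The expected-revenue computation under these business models can be sketched as follows; the prices, impression counts and function names are illustrative assumptions, not figures from the cited studies.

```python
# Sketch: expected revenue for a served ad under two common pricing
# models. All numbers are illustrative assumptions.

def revenue_cpm(impressions, cpm_price):
    """Cost per mille: the advertiser pays per 1,000 impressions."""
    return impressions / 1000 * cpm_price

def revenue_cpc(impressions, ctr, cpc_price):
    """Cost per click: expected clicks = impressions * CTR."""
    return impressions * ctr * cpc_price

impressions = 100_000
ctr = 0.01  # ~1 % CTR, already considered reasonably high for display ads

print(revenue_cpm(impressions, cpm_price=2.0))        # -> 200.0
print(revenue_cpc(impressions, ctr, cpc_price=0.25))  # ~250, so CTR drives CPC revenue
```

This is why CTR prediction matters commercially: under cost per click, expected revenue scales linearly with the predicted CTR.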

User experience when interacting with (mobile) systems tends to be best when the interaction is guided, emotional and clearly encourages the user to perform certain tasks. User experience is influenced by the following factors:

  • trust and credibility; familiarity, visual design, aesthetics, shape, content and interaction have a great impact on perceived credibility [43];
  • visual complexity; the YouTube user experience research lab found that users prefer simple over complex designs and that they prefer designs they are already familiar with [7];
  • usability and accessibility;[32]
  • aesthetics; [33] [34] [35]
  • content type; research shows that video content greatly increases users' dwell time and, in the case of product videos, increases confidence in online purchase decisions;[40]
  • emotions.[18]

Numerous studies claim that better technical solutions, such as faster load times and better browser performance, improve user experience, reduce costs and increase revenue. [36] [37] [38]

Perhaps one of the most successful and widely known studies in the field of user interfaces and their impact on interaction was Amazon’s online purchase process usability research, where a slight adjustment (commonly referred to as "The $300 Million Button") made an enormous impact on revenue, increasing it by 45 %.[41] [42]

Research methods and hypothesis

Research methods

The HTML5 web standard has been adopted as the de facto standard for interactive rich-media mobile advertising (ads). Its openness makes it easier to extract certain multimedia features from the content, such as text, audio, video, animations, images and buttons, as well as other metadata (aspect ratio, format, banner size in pixels and kilobytes, etc.) and limited information about the user (device type (tablet, smartphone or desktop), platform version and network information). It also allows access to sensors and other device capabilities [9]. Image features, such as brightness, saturation, colorfulness, contrast, naturalness and hue, can also be extracted from banner screenshots.
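As an illustration, some of these image features could be derived from raw pixel data roughly as follows. The pixel-list input format is an assumption; a real pipeline would first decode the banner screenshot with an imaging library.

```python
# Sketch: deriving simple image features (brightness, saturation, hue)
# from banner-screenshot pixel data. Pixels are assumed to be given as
# a list of (r, g, b) tuples in the 0-255 range.
import colorsys

def image_features(pixels):
    n = len(pixels)
    h_sum = s_sum = v_sum = 0.0
    for r, g, b in pixels:
        # Convert each pixel to HSV on the 0-1 scale.
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        h_sum += h
        s_sum += s
        v_sum += v
    return {
        "brightness": v_sum / n,  # mean HSV value
        "saturation": s_sum / n,  # mean HSV saturation
        "hue": h_sum / n,         # mean HSV hue (crude: hue is circular)
    }

# Example on a tiny 2x2 "screenshot": red, green, white, black pixels.
feats = image_features([(255, 0, 0), (0, 255, 0), (255, 255, 255), (0, 0, 0)])
print(feats)  # brightness 0.75, saturation 0.5
```

Averaging hue naively ignores its circularity (0 and 1 are the same colour); a production feature extractor would use a circular mean, and would compute colorfulness, contrast and naturalness with their own published formulas.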

Analysis will be done on real data about user behaviour on mobile ads from real marketing campaigns, and also on dummy ads for better comparison. User testing will also be conducted, if necessary.

Some metrics that can be used to better understand user interaction, and possibly user experience, are listed below:

  • number of impressions and clicks;
  • time spent interacting; this can be misleading, since time spent can have different meanings. With goal-specific content (e.g. a weather app), less time spent is better, whereas with entertainment content (e.g. games, videos, photo galleries, animations, etc.) longer dwell times are the desired goal. This must be taken into account;
  • time needed to achieve certain goal or to complete a task;
  • number of components user interacted with: read articles/comments, video views, photo views, product views.
  • screen views, unique views;
  • the percentage of exit from a screen;
  • behaviour flow;
  • content specific actions, as content shares/recommendation, purchase, subscription etc.
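To illustrate, a few of the listed metrics can be computed from a raw interaction log along these lines. The event schema (session, screen, event, timestamp) is a made-up example; real analytics exports will differ.

```python
# Sketch: computing dwell time and screen exit rate from a raw event
# log. The log format is an illustrative assumption.

events = [  # (session_id, screen, event, timestamp_in_seconds)
    ("s1", "gallery", "view", 0),
    ("s1", "gallery", "tap", 4),
    ("s1", "video", "view", 10),
    ("s1", "video", "exit", 40),
    ("s2", "gallery", "view", 0),
    ("s2", "gallery", "exit", 3),
]

def dwell_time(session):
    """Total time between the first and last event of a session."""
    ts = [t for s, _, _, t in events if s == session]
    return max(ts) - min(ts)

def exit_rate(screen):
    """Fraction of views of a screen that ended the session there."""
    views = sum(1 for _, sc, ev, _ in events if sc == screen and ev == "view")
    exits = sum(1 for _, sc, ev, _ in events if sc == screen and ev == "exit")
    return exits / views if views else 0.0

print(dwell_time("s1"))      # 40 seconds spent in the ad
print(exit_rate("gallery"))  # 0.5 -> half of gallery views ended the session there
```

As noted above, the same dwell-time number would be interpreted differently for goal-specific versus entertainment content, so the content type must travel alongside the metric.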

As part of the thesis, the following research methods can or will be used:

  • analysis of patterns that are present in mobile advertisement (user interfaces, their elements and types of interaction; possible improvements in terms of parameterisation, standardisation and evaluation of user experience);
  • analysis of impact of performance (network speed, latency, response times) of served ads;
  • manual content segmentation and automatic data processing (statistical data evaluation);
  • experiment design and analysis, i.e. A/B testing, multivariate statistics;
  • the use of advanced techniques to detect correspondences in the gathered data (data mining);
  • machine learning and image processing.

Hypothesis

  1. H1. Diverse user interfaces and interactions have different, measurable impacts on user experience in interactive mobile display ads;
  2. H2. Based on test results or analysis of a large dataset of user interactions with mobile ads, it is possible to define metrics for user experience (in mobile advertisement) based on user's interaction with different user interfaces;
  3. H3. Based on defined metrics (see H2), we can quantitatively evaluate user's experience and engagement for mobile advertisement;
  4. H4. It is possible to predict user engagement for interaction with mobile content.

Expected contribution to science

The broader goals of the research are an effective understanding of users and their behaviour when interacting with mobile content, and the evaluation and improvement of their experience, preferably task- and content-agnostic. These goals also correspond well with industry needs:

  • model for evaluation of user experience for different types of user interfaces and interactions with mobile ads;
  • improved guidelines for interactive mobile content;
  • understanding the impact of user experience of mobile ads on conversion and marketing attribution.

The research findings will also contribute to faster perception and understanding of content, improve task and problem solving, reduce users' frustration and discomfort when interacting with mobile devices, and overall improve the user experience of mobile interaction, with an emphasis on mobile advertising. There are plenty of advertising techniques that users hate [13], all of which we must avoid in order to prevent damage to the advertised brand or the placement where ads appear.


PhD candidate:
Robert Sedovšek, univ. dipl. inž. graf. tehnol.
Mentor:
doc. dr. Aleš Hladnik

Literature

  1. O’Brien, H. L., Toms, E. G. Examining the generalizability of the User Engagement Scale (UES) in exploratory search. Information Processing & Management, 2012.
  2. Attfield, S., Kazai, G., Lalmas, M., Piwowarski, B. Towards a science of user engagement (Position Paper). WSDM Workshop on User Modelling for Web Applications. 2011.
  3. Fishkin, R. SEOmoz: Testing the Accuracy of Visitor Data from Alexa, Compete, Google Trends, Doubleclick & Quantcast. http://www.seomoz.org/blog/testing-accuracy-visitor-data-alexa-compete-google-trends-quantcast. January 10, 2012, cited May 28, 2013
  4. IAB Attribution Primer. http://www.iab.net/media/file/AttributionPrimer.pdf Cited May 21, 2013
  5. Miller, S. Digital Marketing Attribution. http://www.dmnews.com/digital-marketing-attribution/article/279065/ Cited May 21, 2013
  6. Fisher, L. Simply Zesty: The One Thing We've All Got Wrong For The ROI Of Digital. http://www.simplyzesty.com/Blog/Article/April-2013/The-One-Thing-We-ve-All-Got-Wrong-For-The-ROI-Of-Digital. April 18, 2013, cited May 28, 2013
  7. Tuch, A. N., Presslaber, E. E., Stöcklin, M., Opwis, K. and Bargas-Avila, J. A. The role of visual complexity and prototypicality regarding first impression of websites: Working towards understanding aesthetic judgements. University of Basel, Department of Psychology, Center for Cognitive Psychology and Methodology, vol. 70(11), 2012, page 794–811.
  8. Lindstrom, M. Buyology: How Everything We Believe About Why We Buy is Wrong. New York, Random House, 2009.
  9. Wroblewski, L. Mobile Device Capabilities. http://www.lukew.com/ff/entry.asp?1140 Published June 6, 2010, cited May 20, 2013
  10. Wroblewski, L. Google I/O 2013: Just the Data. http://www.lukew.com/ff/entry.asp?1723 Published May 15, 2013, cited May 26, 2013
  11. Google I/O 2013: Keynote. https://developers.google.com/events/io/. May 2013, cited May 26, 2013
  12. Peternel, K., Pogačnik, M., Tavčar, R. and Kos, A. A Presence-Based Context-Aware Chronic Stress Recognition System. Sensors, 2012, vol. 12, no. 11, page 15888–15906.
  13. Nielsen, J. The Most Hated Advertising Techniques. http://www.nngroup.com/articles/most-hated-advertising-techniques/. December 6, 2004, cited May 24, 2013
  14. Nylander, S., Landquist, T., Brännström, A. and Karlson, B. “It’s Just Easier with the Phone” – A Diary Study of Internet Access from Cell Phones. Lecture Notes in Computer Science, 2009, vol. 5538, page 354–371.
  15. Nylander, S., Lundquist, T., Brännström, A. At home and with computer access: why and where people use cell phones to access the internet. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2009, page 1639–1642.
  16. Mashita, T., Komaki, D., Iwata, M., Shimatani, K., Miyamoto, H., Hara, T., Kiyokawa, K., Takemura, H., Nishio, S. A content search system for mobile devices based on user context recognition. Proceedings of the 2012 IEEE Virtual Reality (VR '12), 2012. IEEE Computer Society, Washington, DC, USA, 2012, page 1–4.
  17. Church, K, Smyth, B. Understanding the intent behind mobile information needs. Proceedings of the 14th international conference on Intelligent user interfaces, 2009, page 247–256.
  18. Norman, D. A. Emotional design: why we love (or hate) everyday things. New York, Basic Books, 2004.
  19. Broder, A., Fontoura, M., Josifovski, V., Riedel, L. A semantic approach to contextual advertising. In Proceedings of the 30th annual international ACM SIGIR conference on Research and development in information retrieval, page 559–566. ACM, 2007.
  20. Interactive Advertising Bureau (IAB). Internet advertising revenue report, 2012 full year results. http://www.iab.net/media/file/IAB_Internet_Advertising_Revenue_Report_FY_2012_rev.pdf. April 2013, cited May 20, 2013
  21. Benway, J. P., Lane, D. M. Banner Blindness: Web Searchers Often Miss "Obvious" Links. Internetworking: ITG Newsletter, 1998, vol. 1, no. 3.
  22. Norman, D. A. Commentary: Banner Blindness, Human Cognition and Web Design. Internetworking, 1999
  23. Benway, J. P. Banner blindness: The irony of attention grabbing on the World Wide Web. Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting, 1998, page 463–467.
  24. Burke, M., Hornof, A., Nilsen, E., Gorman, N. High-cost banner blindness: Ads increase perceived workload, hinder visual search, and are forgotten. ACM Transactions on Computer-Human Interaction, 2005, vol. 12, no. 4, page 423–445.
  25. Goldstein, D. G., McAfee, P. R., Suri, S. The cost of annoying ads. Proceedings of the 22nd international conference on World Wide Web (WWW '13). International World Wide Web Conferences Steering Committee, Republic and Canton of Geneva, Switzerland, 2013, page 459–470.
  26. Karande, C., Mehta, A., Srikant, R. Optimizing budget constrained spend in search advertising. Proceedings of the sixth ACM international conference on Web search and data mining (WSDM '13). ACM, New York, NY, USA, 2013, page 697–706.
  27. Vaver, J., Koehler, J. Measuring Ad Effectiveness Using Geo Experiments. Google Inc., 2011.
  28. Vaver, J., Koehler, J. Periodic Measurement of Advertising Effectiveness Using Multiple-Test-Period Geo Experiments. Google Inc., 2012.
  29. Cheng, H., Roelof van Zwol, Azimi, J., Manavoglu, E., Zhang, R., Zhou, Y., Navalpakkam, V. Multimedia features for click prediction of new ads in display advertising. Proceedings of the 18th ACM SIGKDD international conference on Knowledge discovery and data mining (KDD '12). ACM, New York, NY, USA, 2012, page 777–785.
  30. Petrovič, R., Dali, L. Mladenič, D. Click Prediction in Mobile Display Advertising Based on HTML5 Features. Celtra d.o.o., 2013.
  31. Dembczynski, K., Kotłowski, W., Weiss, D. Predicting ads' click-through rate with decision rules. Workshop on Targeting and Ranking in Online Advertising. vol. 2008. 2008.
  32. Tractinsky, N., Shoval-Katz, A., Ikar, D. What is beautiful is usable. Interacting with Computers. 2000, page 127–145.
  33. De Angeli, A., Sutcliffe, A. and Hartmann, J. Interaction, Usability and Aesthetics: What Influences Users’ Preferences? Centre for HCI Design, School of Informatics, University of Manchester, 2006, page 271–280.
  34. Toomim, M., Kriplean, T., Portner, C., Landay, J. A. Utility of human-computer interactions: Toward a science of preference measurement. Proceedings of CHI 2011: ACM Conference on Human Factors in Computing Systems. ACM, New York, NY, USA, 2011, page 2275–2284.
  35. O’Brien, H., Toms, E. What is user engagement? A conceptual framework for defining user engagement with technology. Journal of the American Society for Information Science and Technology, 2008, vol. 59, no. 6, page 938–955.
  36. Interactive Advertising Bureau (IAB). Ad Load Performance Best Practices. http://www.iab.net/media/file/IAB_Ad_Load_Perfomance_BP_FINAL.pdf. October 2008, cited May 28, 2013
  37. Simic, B. “The Performance of Web Applications: Customers Are Won or Lost in One Second”. Aberdeen Group, 2008.
  38. Schurman, E., Brutlag, J. “Performance Related Changes and their User Impact”. Microsoft, Google. Velocity Conference, San Jose, CA, USA, 2009.
  39. Mobile Marketing Association (MMA), Interactive Advertising Bureau (IAB), Media Rating Council (MRC). Mobile Application Advertising Measurement Guidelines. http://www.iab.net/media/file/MobileAppsAdGuidelinesv1.0FINAL.pdf. February 2013, cited May 28, 2013
  40. Invodo. Video Statistics: The Marketer's Summary. http://www.shop.org/sites/default/files/invodo_-_video_statistics_-_the_marketers_summary_february_2013_0.pdf. 2013, cited May 28, 2013
  41. Spool, J.M. How Changing a Button Increased a Site's Annual Revenues by $300 Million. User Interface Engineering, 2009.
  42. Spool, J.M. The Back Story for the $300 Million Button. User Interface Engineering, 2011.
  43. The Web Credibility Project - Stanford University: Publications. http://credibility.stanford.edu/publications.html. 2007, cited May 28, 2013
  44. Nielsen, J. Will plain text ads continue to rule? Nielsen Norman Group, 2003.
  45. Arnheim, R. Visual Thinking. Berkeley, University of California Press, 2004.
  46. Arnheim, R. Art and Visual Perception: A Psychology of the Creative Eye. Berkeley, University of California Press, 2004.
  47. Krug, S. Don't Make Me Think: A Common Sense Approach to Web Usability. 2nd Edition. New Jersey, New Riders, 2006.
  48. Stein, C., Cormen, T. H., Rivest, R. in Leiserson, C. E. Introduction to Algorithms. 2nd Edition. New York, MIT Press and McGraw-Hill, 2001.
  49. Skiena, S. S. The Algorithm Design Manual. New York, Springer, 1998.
  50. Dobelli, R. The Art of Thinking Clearly: Better Thinking, Better Decisions. London, Hodder & Stoughton, 2013.
  51. Ries, A. in Trout, J. Positioning: The Battle for Your Mind.New York, McGrew–Hill, 2000.
  52. Norman, D. A. The design of everyday things. New York, Doubleday. 1990.
  53. Norman, D. A. The invisible computer: why good products can fail, the personal computer is so complex, and information appliances are the solution. Cambridge, The MIT Press, London, Mass, 1999.
  54. Giardina, A., Vasa R. in Cian Tan, F.T. Impact of viral propagation on user interface design. Proceeding: Proceedings of the 24th Australian Computer-Human Interaction Conference, 2012, ACM, New York, 2012, page 154–157.
  55. Calder, B.J., Malthouse, E.C. in Schaedel, U. An Experimental Study of the Relationship between Online Engagement and Advertising Effectiveness. Journal of Interactive Marketing, 2009, vol. 23, no. 4, page 321–331.
  56. Medhi, I., Lakshmanan, M., Toyama, K. in Cutrell, E. Some Evidence for the Impact of Limited Education on Hierarchical User Interface Navigation. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2013, page 2813–2822.
  57. Roy, S. D., Lotan, G. in Zeng, W. Social Multimedia Signals: Sense, Process, and Put Them to Work. MultiMedia, IEEE, 2013, vol. 20, no. 13.