To promote digital transformation, digital skills development must receive the same emphasis as infrastructure development. To ensure digital training programmes are adequately managed, a standardised data collection strategy is required to measure an internationally accepted Digital Literacy Index. This index must rest on a dynamic definition of Digital Literacy that is responsive to the fluid needs of the digital economy. Achieving this requires a G20 standard-setting body to inform a nationally representative data collection strategy. Furthermore, the adopted standards which inform the data collection process must remain cognisant of the evolving demands of employers.
Challenge
The G20 communiqués of 2015 and 2016 made pronouncements on addressing the digital divide but focused largely on infrastructure development, financial inclusion and digital trade. Insufficient attention has been paid to the need for digital skills, partly due to the difficulty of defining and measuring Digital Literacy. Digital Literacy, like general literacy, provides an individual with the capability to achieve other valued outputs in life, especially in the modern digital economy. Unlike literacy, however, the definition of Digital Literacy is contested, leading to different and inconsistent sets of indicators for measuring it. A consistent, standardised definition of Digital Literacy is required across G20 countries for the purposes of data collection, analysis and measurement.
A Digital Literacy measurement offers policy makers a means to monitor the diffusion of digital skills across countries. For effective alignment, policy makers should guard against a one-dimensional approach that narrowly measures technical usage, and instead focus on the multi-dimensional nature of Digital Literacy.
Emerging and developing economies, in particular, emphasise infrastructure development, yet they will not be able to leverage the full potential of such investments without a comprehensive skilling programme that educates the currently disadvantaged and disconnected population about the benefits of digital tools. Furthermore, without consistent and comparable measurement indicators to locate the digitally illiterate sectors of the population, policy makers are unable to pursue digital transformation objectives effectively.
Proposal
G20 policy makers urgently require an indicator measuring the uptake of Digital Literacy across countries to ensure that policies are targeted at the areas of most need. To make this a reality, the following proposals for the G20 are made: (1) adopt a standardised, multi-dimensional definition of Digital Literacy, (2) produce a standardised, multi-dimensional Digital Literacy Index, and (3) align the demand and supply of digital skills across all role players.
The G20 should adopt a standardised definition to measure Digital Literacy
Digital Literacy provides an individual with core capabilities to achieve valued outputs in life. It is a critical enabler of digital transformation, opening up employment opportunities and the ability to access digital content (the medium of the digital economy) and online services. Crucially, there is no universally accepted definition of Digital Literacy. The G20 needs a commonly accepted definition, supported by a standard-setting body. Academics, the public sector and the business sector have not reached consensus on what specifically constitutes Digital Literacy. Consequently, there are no comparable measurements of Digital Literacy with which to track progress in attaining such skills, particularly in emerging and developing economies combating the effects of the digital divide.
Drawing on recent literature from UNESCO (2011), the SCONUL Working Group on Information Literacy (2011), Lankshear and Knobel (2008), Greene, Yu and Copeland (2014), SIEMENS (2017), Covello (2010), McKinsey & Company (2014), Ridsdale et al. (2015), and various other contributors to the Digital Literacy literature, it is clear that Digital Literacy is a multi-dimensional concept. However, the authors differ over the specific set of dimensions which constitute Digital Literacy. Our study identifies five dimensions, viz. Information Literacy, Computer Literacy, Media Literacy, Communication Literacy and Technology Literacy. Each dimension is further analysed from three perspectives, viz. Cognitive, Technical and Ethical (see Table 1). These five dimensions and three perspectives broadly cover all the conceptual components of Digital Literacy, and should underpin how Digital Literacy is defined, measured and taught.
UNESCO (2011) describes Digital Literacy as a set of basic skills required for working with digital media, and for information processing and retrieval. It also enables participation in social networks for the creation and sharing of knowledge, and supports a wide range of professional computing skills. However, focusing solely on technical aspects of Digital Literacy, such as accessing and using tools, to the exclusion of an awareness of the cognitive and ethical concerns of digital technologies, poses a long-term risk for users. Cognitively, a user is constantly processing content, evaluating, critiquing and synthesising multiple sources of information. Concurrently, the user must also be cognisant of what constitutes appropriate use of such tools. Knowing how to discern what is appropriate and how to derive meaning whilst using digital technologies is as important as using the technology itself.
| Dimension | Cognitive perspective | Technical perspective | Ethical perspective |
| --- | --- | --- | --- |
| Information (digital content) | Synthesis | Access, usage | Appropriate usage |
| Computer (hardware and software) | Evaluate | Usage | Appropriate usage |
| Media (text, sound, image, video, social) | Critique, create | Navigation | Assess truthfulness |
| Communication (non-linear interaction) | Critique, create | Develop and use content | Appropriate usage |
| Technology (tools for life situations) | Invent, evaluate tools | Usage | Appropriate usage |
Table 1: Simplified representation of the Digital Literacy dimensions and perspectives
A key point raised by UNESCO is that Digital Literacy improves one’s employability because it is considered a ‘gate’ skill required by employers. It is a catalyst for individuals to acquire other valued outcomes.
Benefits of measuring Digital Literacy
Appropriately measuring Digital Literacy, and ensuring that policies are agile enough to react to the dynamic nature of digital skills, will lead to productivity gains across a country. Bunker (2010) attributes this productivity gain to a greater share of employers and employees meeting the basic requirements of Digital Literacy, and to those attaining a greater level of mastery of digital technologies. With more employees operating in the product and services sectors at an internationally competitive skill level, both employers and national economies stand to benefit.
Through a quantitative understanding of the location, dimensions and nature of a population’s collective state of literacy, policy makers are better prepared to make the choices necessary to ensure digital transformation. A Digital Literacy indicator and data collection strategy informed by the broad dimensions of Digital Literacy identified in Table 1 will enable policy makers to specify goals, set targets and plan appropriately (Oxenham, 2008). If one assumes that a completely digitally literate population will be realised progressively over time, one must track the rates of Digital Literacy attainment.
Weaknesses of the current definitions of Digital Literacy
To attain the holistic view of Digital Literacy required by policy makers, an inclusive composite measurement or index is needed. However, the current measurements of digital literacy suffer in the following respects:
- Private agencies have adopted a narrow conceptual view of Digital Literacy. Measurements tend to focus only on the technical perspective of the various dimensions of Digital Literacy highlighted in Table 1.
- The sampling strategies adopted in current data collection instruments are not suitably representative of the country, leading to invalid conclusions.
- Digital Literacy measurement instruments are often accessible only online, thus excluding vast portions of the workforce without access to such facilities.
- The proxies for Digital Literacy adopted by private agencies do not suitably capture the complexities of Digital Literacy. For example, Facebook access or usage does not imply Digital Literacy.
- Digital Literacy assessments will need to evolve with the ever-developing modes of creativity and educational methods (UNESCO, 2011).
G20 nations should produce a standardised multi-dimensional Digital Literacy Index
A well-executed Digital Literacy measurement strategy will allow countries to track their trajectories towards full Digital Literacy and international competitiveness. There is an ever-increasing need to understand the fluid nature of what constitutes Digital Literacy in the modern economy. Skills currently considered a superior level of mastery may well become the future expected norm. To prepare for such eventualities, the G20 requires (a) a Digital Literacy standard-setting body, (b) a Digital Literacy assessment data collection instrument informed by a representative sampling strategy, and (c) an assessment instrument based on the multi-dimensional nature of Digital Literacy.
Standardisation of Digital Literacy across the G20
This study recommends that the G20 institute a Digital Literacy standard-setting body as a further progression of the existing G20 Skills Strategy (G20 Leaders, 2015b). As Digital Literacy will remain a dynamic concept, this body will be responsible for maintaining its definition and underlying set of dimensions, and for identifying the most appropriate means of performing a Digital Literacy assessment.
The PIRLS assessment framework (used for literacy) follows the guidelines of the International Standard Classification of Education (ISCED) and is managed by the International Association for the Evaluation of Educational Achievement (IEA). A similar organisation should carry out this function to inform how an internationally consistent assessment is conducted. The body should also ensure that comparable internationally accepted standards are adopted to inform each dimension of Digital Literacy. Furthermore, the body will oversee the appropriate data collection agencies within the G20 and guide their data collection efforts.
The standard-setting body must pay particular attention to the dynamic requirements of business entities. In doing so, the body will inform the G20’s digital skills training agencies of the minimum requirements of the business sector. For example, the USA’s Northstar training programmes (Northstar, n.d.) may serve as a benchmark model. In these programmes, learners who perform satisfactorily are awarded an appropriate certification recognised by the business sector. Such certification and recognition will enable a through-put of new entrants into formal employment. With greater employment opportunities deriving from such a certification process, there will be greater incentives for learners to enrol in Digital Literacy training programmes.
Data collection strategy informed by a representative sampling
To develop a comprehensive composite Digital Literacy indicator that measures the degree of competence amongst the population, it is suggested that a multi-dimensional data collection instrument be designed and administered by the G20’s national research or data collection bodies, informed by the G20 standard-setting body. To produce nationally representative results, and considering the low levels of internet and mobile access in emerging economies together with the high costs of internet access (McKinsey & Company, 2014), a regular survey based on a representative sampling of the national population could be conducted. Joncas and Foy (2012) discuss the process followed in measuring international literacy via the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS) data collection instruments, highlighting that rigorous sampling exercises are needed across countries to ensure the target population is estimated correctly. The sample must be age-appropriate, targeting all individuals comprising a country’s workforce; depending on the country, this may range from 15 to 65 years.
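To make the weighting step concrete, here is a minimal sketch of how a design-weighted national Digital Literacy rate could be estimated from a stratified sample. The strata, population figures and individual scores below are entirely hypothetical and serve only to illustrate the arithmetic, not any country's actual methodology.

```python
# Hypothetical illustration: estimating a national Digital Literacy rate
# from a stratified sample using population-share design weights.

strata = {
    # stratum: (population size, sampled scores where 1 = digitally literate)
    "urban":      (6_000_000, [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]),
    "peri-urban": (3_000_000, [1, 0, 0, 1, 0, 0, 1, 0, 1, 0]),
    "rural":      (5_000_000, [0, 0, 1, 0, 0, 0, 1, 0, 0, 0]),
}

total_population = sum(pop for pop, _ in strata.values())

# Weight each stratum's sample mean by its share of the population, so that
# under-connected (often rural) groups are represented in proportion to size.
national_rate = sum(
    (pop / total_population) * (sum(scores) / len(scores))
    for pop, scores in strata.values()
)

print(f"Estimated national Digital Literacy rate: {national_rate:.1%}")
```

A survey conducted in person rather than online, with weights of this kind, avoids the online-only selection bias noted earlier.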
The PIRLS literacy assessment includes a written test of comprehension and additional questions targeting the various factors associated with the development of reading literacy (Anand et al., n.d.). A similar but more complex process is required to measure the multi-dimensional nature of Digital Literacy: the data collection must include a pure ability-based literacy assessment whilst also capturing the ancillary factors underlying the various dimensions of Digital Literacy.
Develop a multi-dimensional Digital Literacy Index
The structure of the Digital Literacy assessment should be informed by the broad understanding of Digital Literacy, inclusive of Information, Computer, Media, Communication and Technology Literacies, whilst addressing the cognitive, technical and ethical perspectives, as adopted by the proposed G20 Digital Literacy standard-setting body.
It is also crucial to be able to disaggregate the composite Digital Literacy Index by dimension and perspective. For example, businesses place greater significance on the technical perspective of each dimension of Digital Literacy. Therefore, as discussed in the OECD (2001) study, a Digital Literacy (Technical Perspective) composite index is as important for planning purposes as the overall Digital Literacy composite index. The same applies to each dimension and perspective discussed in Table 1.
Ultimately, the overall Digital Literacy Index produced per country must balance each dimension and perspective equally. Through the introduction of such an index, it is envisioned that policy makers will be empowered to target policy at the most affected and disadvantaged portions of the population lacking the core sets of skills valued by employers.
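As an illustration of how such an equally weighted composite could be computed and disaggregated, the following sketch uses entirely hypothetical scores; only the dimension and perspective names are taken from Table 1, and the equal-weighting scheme is one possible design choice, not a prescribed formula.

```python
# Hypothetical illustration: an equally weighted composite Digital Literacy
# Index over the five dimensions and three perspectives of Table 1.
# All scores (0-100) are invented survey results for one country.

scores = {
    # dimension: {perspective: score}
    "Information":   {"cognitive": 62, "technical": 71, "ethical": 55},
    "Computer":      {"cognitive": 58, "technical": 75, "ethical": 60},
    "Media":         {"cognitive": 49, "technical": 68, "ethical": 45},
    "Communication": {"cognitive": 54, "technical": 70, "ethical": 52},
    "Technology":    {"cognitive": 47, "technical": 66, "ethical": 50},
}

def sub_index(perspective: str) -> float:
    """Disaggregated index for one perspective, averaged over all dimensions."""
    return sum(d[perspective] for d in scores.values()) / len(scores)

# Overall index: equal weight for every dimension-perspective cell.
overall = sum(sum(d.values()) for d in scores.values()) / (len(scores) * 3)

print(f"Technical-perspective sub-index: {sub_index('technical'):.1f}")
print(f"Overall Digital Literacy Index:  {overall:.1f}")
```

The same `sub_index` function applied per dimension would give policy makers the full disaggregation discussed above.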
Alignment of the Demand and Supply of Digital Skills in the G20
It is critical to understand the connection between the demand for digital skills from employers and higher education institutions, and the supply of digital skills through school-based or vocational digital skills training programmes. To this end, policy makers in education departments across the G20 must be suitably informed by the proposed G20 standard-setting body of the minimum requirements for Digital Literacy certification. Furthermore, higher education institutions need to be agile enough to adapt to this fluid set of requirements. Although alignment is complex, given the need to change curricula as standards change, countries must make the effort to adapt.
Employers actively engaged in the digital economy require a varied collection of digital skills to compete in the dynamic modern economy. Learners who leave the school system must be adequately prepared to compete in this fast-paced job market. To this end, a measurement which scores the abilities of learners exiting schools and of the existing workforce must assess, per person, which collection of skills they possess. The OECD (2001) recognised the changing minimum set of educational requirements needed for the modern knowledge economy and noted that it is possible for an individual to have a high level of digital literacy and a weak level of education. The core skills valued by employers tended to be basic technology/ICT knowledge; however, employees in more knowledge-intensive positions require a greater depth of knowledge pertaining to the interpretation, analysis and evaluation of data.
This scaling of digital skills requirements points to a job trajectory informed by the literacy, fluency and mastery scale discussed by Ridsdale et al. (2015), where a data scientist would fall at the mastery level, analysts would generally attain fluency, and low-level employees or new entrants to the job market would possess basic literacy. The crucial point is that digital literacy is the minimum requirement for an employee to enter such organisations. Without such knowledge and skills, an individual does not possess the capabilities needed to contribute to the modern economy.
References
- Anand, G., Chu, H., Wang, J., Morales, L., Pan, O., French, S., … Li, Z. (n.d.).
- Bunker, B. (2010). A Summary of International Reports, Research and Case Studies of Digital Literacy – Including implications for New Zealand of adopting a globally-recognised digital literacy standard.
- Covello, S. (2010). A review of digital literacy assessment instruments. Syracuse University.
- Department of Education. (2006). White Paper on e-Education.
- G20 Leaders. (2015a). G20 Leaders’ Communiqué, Antalya Summit, 15-16 November 2015.
- G20 Leaders. (2015b). G20 Skills Strategy.
- Greene, J. A., Yu, S. B., & Copeland, D. Z. (2014). Measuring critical components of digital literacy and their relationships with learning. Computers and Education, 76, 55–69.
- Helsper, E. J., & Van Deursen, A. (2015). Digital Skills in Europe: Research and Policy. In Digital Divides.
- Joncas, M., & Foy, P. (2012). Sample Design in TIMSS and PIRLS. Methods and Procedures in TIMSS and PIRLS 2011.
- Lankshear, C., & Knobel, M. (2008).
- McKinsey & Company. (2014). Offline and falling behind: Barriers to Internet adoption.
- Northstar. (n.d.). Northstar Basic Computer Skills Certificate.
- OECD. (2001). Competencies for the knowledge economy. Education Policy Analysis 2001.
- Oxenham, J. (2008). Effective literacy programmes: options for policy-makers. Fundamentals of Educational Planning. UNESCO, International Institute for Educational Planning.
- Ridsdale, C., Rothwell, J., Smit, M., Ali-Hassan, H., Bliemel, M., Irvine, D., … Wuetherick, B. (2015). Strategies and Best Practices for Data Literacy Education: Knowledge Synthesis Report.
- SCONUL Working Group on Information Literacy. (2011). The SCONUL Seven Pillars of Information Literacy.
- SIEMENS. (2017). African Digitalization Maturity Report.
- UNESCO Institute for Information Technologies in Education. (2011). Digital Literacy in Education.
- World Economic Forum. (2016). Digital Transformation of Industries: Digital Enterprise.