Self-Perceived Digital Competencies in Educational Online Migration Due to COVID-19 Confinement

Objectives: The purpose of the study was to validate a measurement scale to assess self-perceived digital competencies of Mexican university students who migrated from a mixed school-digital system to a fully digitalized educational environment because of COVID-19 confinement. The instrument was based on the European Union Digital Competence Framework. Method: 1,118 participants aged between 18 and 47 years completed the assessment, including 677 females (60.6%) and 429 males (38.4%). Confirmatory factor analysis (CFA) was used to assess scale structure. Results: Results of the CFA showed an excellent fit to the data, χ²/df = 3.27, p = .01, comparative fit index (CFI) = 0.97, root-mean-square error of approximation (RMSEA) = 0.05. Cronbach's α values for the scale and subscales ranged from .78 to .83. Three of the four subscales predicted the number of assignments completed and submitted, a measure of predictive validity. Conclusions: The Digital Competence Scale for Online Migration is a valid scale for Mexican students and demonstrates predictive validity. Implications for Theory and/or Practice: The scale may be useful in planning activities for the reinforcement of digital competencies and to identify difficulties and support specific pedagogy for online teaching/learning processes.


Introduction
On December 30, 2019, the health department of Wuhan city, Hubei Province of the People's Republic of China, reported a set of pneumonia cases of unknown etiology. On January 9, 2020, Chinese scientists determined that the cause was a new type of coronavirus (Liu et al., 2020). By January 30, 2020, the new disease had spread to 11 of China's provinces. On January 30, 2020, the WHO declared the new disease a public health emergency of international concern (PHEIC; Xiao et al., 2020; Zhou, 2020). International health regulations (WHO, 2005) define a PHEIC as an extraordinary event that poses a public health risk to other states and may require immediate action. Since the regulations were signed by WHO member countries in 2005, a PHEIC has been declared five times, in relation to the outbreaks of H1N1 influenza in 2009, polio in 2014, Ebola in 2014, Zika in 2016, and the new coronavirus in 2019.
On February 11, 2020, WHO, following its case definition guidelines, named the disease caused by the novel coronavirus COVID-19 (Pan American Health Organization [PAHO], 2020a). On March 11, 2020, WHO stated that, after confirming more than 118,000 cases and 4,291 deaths in 114 countries, COVID-19 could be considered the first pandemic caused by a coronavirus in history and called on countries to take urgent and aggressive prevention measures in the absence of specific treatment (WHO, 2020).
In Mexico, the first confirmed case of COVID-19 occurred on February 27, 2020 (Worldometer, 2020). On March 14, the Ministry of Public Education (SEP) announced the suspension of school activities and the extension of Easter vacation from March 20 to April 20 (Jiménez, 2020). On March 23, WHO placed Mexico in Phase 2 of the pandemic scale, even though national authorities did not accept it (Morán, 2020). Yet, on March 24, the Mexican government declared the start of Phase 2 and called for a preventive quarantine, asking people to stay at home while allowing only the activities of the essential sectors (Arista, 2020). Despite quarantine, the disease reached 54,346 confirmed cases and 5,666 deaths by May 21 (PAHO, 2020b).
Mexico's educational authority suspended face-to-face school activities before the federal government ordered the suspension of many other social and economic activities. This decision was based on experience acquired in containing previous influenza outbreaks; the measures taken then helped to reduce the number of influenza cases and delay peaks of maximum contagion (Viner et al., 2020). By March 23, 2020, 29 other countries had closed schools at different levels according to infection rates. Mexico's higher education sector, like most universities globally, quickly moved to provide classes in compliance with social isolation measures and under government sanitary guidance (Crawford et al., 2020). One implemented strategy was the digitization of the curriculum; the digital curriculum plays a crucial role because it allows rapid implementation and international expansion.

Literature review
Due to COVID-19, many universities around the world closed their campuses and migrated learning, teaching, and assessment activities to digital environments (Watermeyer et al., 2020). However, this move was made without the appropriate online pedagogical methods, creating an imbalance between the new approaches and educational quality (Crawford et al., 2020). The impacts of this migration on academic communities have been extensive. Faculty members are facing emergency remote teaching rather than simply designing and delivering the usual online courses, a shift that has diminished time for planning, teaching, and implementing quality measures while reducing student attendance (O'Keefe et al., 2020). Additionally, faculty members are now compelled to coordinate online instructional design and student learning, effectively deliver online instruction, provide adequate and timely support for both teachers and students, assure high-quality participation of teachers and students, and make provisions for contingency plans to deal with unexpected needs and incidents arising from the novel educational platforms (Bao, 2020).
These are not easy tasks. Faculty are facing challenges, including a lack of online teaching experience, preparation, or support from educational technology teams. Additionally, many students lack distance learning experience and face technical and operational obstacles. Faculty must make provisions to overcome server saturation when hosting large numbers of users, get online support from teaching assistants, and reinforce the digital abilities of both students and teachers (Bao, 2020).
Digital ability is the set of skills, knowledge, and attitudes required for a person to be proficient in digital environments and make creative, critical, and safe use of information and communication technologies. These competencies are useful in achieving objectives related to work, employability, learning, free-time use, inclusion, and participation in society (Carretero et al., 2017). Regarding student and teacher abilities to understand and use online settings for useful education, digital competencies are a key resource. It is usually taken for granted that higher education students and teachers have adequate digital competencies, but this is not always the case. Many of them are not able to manage the required levels of social, cognitive, and learning participation in a fully online educational environment. This suggests that digital competency development must be considered an educational priority (Blayone, 2018).
According to the European Union Digital Competence Framework, digital competence comprises the following dimensions: (a) information and data literacy that allows people to locate, retrieve, store, manage, and organize digital data, information, and content; (b) communication and collaboration through digital technologies; (c) digital content creation and knowledge about giving the right instructions to a computer system; (d) identity and security management of devices, content, personal data and privacy protection in digital environments; and (e) ability to solve conceptual problems and unsettled situations in digital environments (Carretero et al., 2017). Competence in these dimensions is required to participate in increasingly visual and digital educational environments (Matrix & Hodson, 2014). It helps students find new and valid information related to their field of study by using tools such as digital libraries, databases, web portals, blogs, and social networks through which they could disseminate their own knowledge and express themselves in a range of languages (Moreira, 2010).
In order to implement digital training programs in higher education systems, reliable measurement of needs and proficiency levels are required. There is a vast number of scales intended for this purpose. According to Calvani et al. (2009), these measurement tools can be classified into objective and subjective. Objective measures include class performance testing or rubrics and automated performance testing on simulation and online platforms. Subjective measures include self-reporting tools for the perception of digital skills.
Objective forms of performance testing are commonly used to assess digital competencies in a given domain in a practical way, yet they are often limited to specific software programs that can leave knowledge and abilities untested (Maderick et al., 2015). Subjective assessments are less specific than the objective ones, and they allow assessment of user attitudes regarding digital competence (Fite et al., 2009); however, they can be inaccurate and tend to have little correlation with the results of objective measures (McCourt Larres et al., 2003). However, subjective measures are used because they are practical and solve the problem of software specificity (Ghomi & Redecker, 2019). Therefore, it is advisable to apply both objective and subjective tests to allow users to receive full feedback regarding their skills and attitudes (Carrera et al., 2011).
There are other characteristics by which tools measuring digital skills can be classified. One of them is the targeted sector, which can comprise students, teachers, and the general population. Another is educational level (Maderick et al., 2015). Tests can assess multiple competencies depending on the level of specificity and the assumed theoretical model (Ghomi & Redecker, 2019). Regarding educational and occupational levels of the respondents, measurement tools vary in terms of specific software, assessed skills, and procedures applied to the requirements of each profession (Ghomi & Redecker, 2019; Maderick et al., 2015). UNESCO notes that digital competencies vary among countries, educational levels, and cultural contexts (Manos & Montoya, 2018). In view of this problem, UNESCO strongly recommends designing appropriate measurement tools or at least adapting them to the specific conditions and requirements of educational systems (Midoro, 2013).
A review of the literature on instruments used with college students showed a wide variety in the assessed dimensions of each. Table 1 includes selected examples of different instruments and the dimensions assessed in each.
As is shown in Table 1, instruments reviewed have different limitations, ranging from being made for specific hardware or software to excluding information on certain competencies such as collaboration, communication, or identity management. Therefore, we can see that there are no homogenous dimensions or theoretical backgrounds to the different tools. Consequently, the purpose of the present study was to develop and psychometrically validate a self-report instrument that measures the perceived difficulty of specific behaviors performed by students who had to migrate to online classes due to COVID-19 confinement.
The team used the dimensions of the European Union Digital Competence Framework (DigComp; Carretero et al., 2017) as its framework. This framework was chosen because it helps to improve digital competence by allowing users to set learning goals, identify training opportunities, and facilitate job search. The framework considers five areas (Carretero et al., 2017).
1. Information and Data Literacy, which allows people to locate, retrieve, store, manage, and organize digital data, information, and content.
2. Communication and Collaboration through digital technologies.
3. Digital Content Creation that requires knowledge of how to properly use a computer system.
4. Identity and Security Management of devices, content, personal data, and privacy protection in digital environments.
5. Problem Solving, the ability to solve conceptual problems in digital environments.

Purpose of the Study and Hypotheses
The purpose of the study was to develop and validate a digital competencies questionnaire for Mexican university students in the context of online migration due to COVID-19 confinement. The questionnaire was designed to test the perception of difficulties in achieving competencies during and after the transition process. We found no published scales for assessing self-perceived digital competence in this specific context.
Higher Learning Research Communications 51

Sample and Procedures
The study was approved by the campus dean and the university ethics committee of a private university in Mexico City. The sample included all teachers attached to the education department of the institution. Teachers e-mailed an invitation to their respective students to participate in the study. Participants signed an informed consent under Mexico's Federal Law on the Protection of Personal Data Held by Private Parties (2010). Once they consented, students answered the questionnaire, and their answers were automatically recorded in an online database.

Instrumentation
The research team designed an initial version of the Digital Competence Scale for Online Migration with 12 items. The items have the following lead-in: "Below is a series of statements about situations you have experienced in classes during the COVID-19 pandemic. Please read each one and choose the option that best reflects how you feel in terms of how difficult it has been. Please make sure you answer all the questions."
Items were assessed using a 5-point Likert-type scale ranging from Very Easy to Very Difficult. Items were designed to test four dimensions: Communication, Content Creation, Problem Solving, and Identity Management. A panel of four judges experienced in online teaching developed a document that included specifications about the content of each dimension and the wording of initial items, which are presented in Table 2.
Note that the dimension of digital literacy was not included because experts noted that this dimension involves locating, retrieving, and storing digital information, and the digital platform already provided students with all needed classroom materials, such as readings, e-books, and research papers. To ascertain predictive validity, we asked students to report the number of assignments successfully completed since the online migration started.
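As a purely illustrative sketch of how such Likert responses can be scored (the item names, dimension assignments, and the middle response label below are hypothetical placeholders, not the actual scale content), each response option can be mapped to a numeric value and averaged within its dimension:

```python
# Illustrative scoring sketch for a 5-point difficulty scale.
# Item and dimension contents are hypothetical placeholders.
LIKERT = {"Very Easy": 1, "Easy": 2, "Neutral": 3, "Difficult": 4, "Very Difficult": 5}

DIMENSIONS = {
    "Communication": ["item1", "item2"],
    "Problem Solving": ["item3", "item4"],
}

def score_respondent(responses):
    """Return the mean difficulty score per dimension for one respondent."""
    scores = {}
    for dim, items in DIMENSIONS.items():
        values = [LIKERT[responses[item]] for item in items]
        scores[dim] = sum(values) / len(values)
    return scores

example = {"item1": "Easy", "item2": "Very Easy",
           "item3": "Difficult", "item4": "Very Difficult"}
# Communication mean = (2 + 1) / 2 = 1.5; Problem Solving mean = (4 + 5) / 2 = 4.5
```

Higher dimension means indicate greater perceived difficulty, i.e., lower self-perceived competence.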

Data Analysis
R and the lavaan package for latent variable modeling (Rosseel, 2012) were used for analyses. Normality was assessed through several measures, including (a) skewness; (b) kurtosis; (c) the Anderson-Darling univariate normality test, which compares the responses to each item with a normal probability distribution; and (d) the Henze-Zirkler multivariate normality test (Henze & Zirkler, 1990), which compares the joint distribution of the items with a normal distribution. Cronbach's α, a measure of internal consistency, was calculated for each scale. CFA (Edwards & Wirth, 2009) was then performed to provide information about item discrimination and construct validity of the Digital Competence Scale for Online Migration. Maximum likelihood estimation (or robust maximum likelihood when data are not normally distributed) was used (West et al., 2012). Model identification was achieved by setting the factor loading of the first item to 1 to define the metric of the latent variable (Kenny & Milan, 2012). Three goodness-of-fit indexes were used to assess the overall fit: chi-square (χ²) divided by degrees of freedom, whose value should be ≤ 3; Bentler's CFI, which should be greater than or equal to 0.90; and the RMSEA, which should be no more than 0.08 (Hu & Bentler, 1998). To test predictive validity, we performed multiple linear regression.
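For readers unfamiliar with the internal-consistency step, Cronbach's α for k items is k/(k − 1) × (1 − Σ item variances / variance of total scores). The pure-Python sketch below illustrates this standard formula; it is not the study's R code, and the sample data are invented:

```python
def variance(xs):
    """Population variance, as commonly used in the Cronbach's alpha formula."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """items: list of k item-score lists, each with one score per respondent."""
    k = len(items)
    item_vars = sum(variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    return k / (k - 1) * (1 - item_vars / variance(totals))

# Three perfectly parallel items yield alpha = 1.0
perfect = [[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]
```

Values in the .78 to .83 range, as reported for this scale, are conventionally read as acceptable to good internal consistency.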

Psychometric Validation
Analyses indicated that items were not normally distributed, as skewness and kurtosis were outside acceptable normality ranges. The Anderson-Darling univariate normality test showed significant differences from the normal distribution, and the Henze-Zirkler (HZ) multivariate normality test (HZ = -2.902, p < .01) showed that items were not normally distributed either individually or as a whole. Therefore, the subsequent statistical analyses were selected with this condition in mind, taking care not to violate normality assumptions. Table 3 shows the results of the CFA model and its adjustment. The results of the original model showed a lack of global fit to the data. Therefore, local fit failures were analyzed by means of modification indexes. These indicated that two items needed to be dropped from the scale: "Solve problems with my digital skills" and "Participate in class." The modified model showed proper fit across all indexes. Figure 1 shows the standardized factor loadings. The overall fit was excellent, χ²/df = 3.27, p = .01, CFI = 0.97, RMSEA = 0.05. The Digital Competence Scale for Online Migration, in the original Spanish and in English translation, is shown in its entirety in the Appendix.
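The reported fit indexes follow standard closed forms. As a hedged sketch (the χ² inputs below are hypothetical, not the study's actual values), RMSEA and CFI can be computed from the model and baseline chi-square statistics:

```python
import math

def rmsea(chi2, df, n):
    """Root-mean-square error of approximation (standard closed form)."""
    return math.sqrt(max(chi2 - df, 0) / (df * (n - 1)))

def cfi(chi2, df, chi2_null, df_null):
    """Bentler's comparative fit index from model and baseline (null) chi-squares."""
    d_model = max(chi2 - df, 0)
    d_null = max(chi2_null - df_null, d_model)
    return 1 - d_model / d_null

# Hypothetical example: chi2 = 98.1 on df = 30 with n = 1118 respondents.
# rmsea(98.1, 30, 1118) falls well under the conventional 0.08 cutoff.
```

Both functions shrink toward their "good fit" targets (RMSEA → 0, CFI → 1) as the model χ² approaches its degrees of freedom.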

Figure 1: Confirmatory Model and Factor Loadings for the Digital Competence Scale for Online Migration
The mean and standard deviation for each dimension, as well as the values for Cronbach's α, are presented in Table 4. All values for Cronbach's α indicate internally consistent scales (values ranged from .78 to .83). The highest degree of difficulty was reflected in Problem Solving, followed by Communication, Content Creation, and Identity Management. Multiple linear regression analysis was performed to predict the number of assignments successfully completed. A significant model resulted, F(4, 265) = 31.27, p < .01, which explained 25% of the variance in the number of assignments completed from three of the subscales (Communication, Content Creation, and Problem Solving). The results of the analysis are shown in Table 5.
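To illustrate the predictive-validity step, the sketch below fits ordinary least squares and reports R², the proportion of variance explained. It is deliberately simplified to a single predictor with made-up numbers; the study regressed assignments completed on four subscales simultaneously:

```python
def ols_r2(x, y):
    """Simple one-predictor OLS: returns (intercept, slope, R^2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return intercept, slope, 1 - ss_res / ss_tot

# Made-up data: higher perceived difficulty -> fewer assignments completed.
difficulty = [1, 2, 3, 4, 5]
completed = [10, 9, 7, 6, 4]
```

With multiple predictors, the same R² logic applies, and an F test (as reported above) assesses whether the predictors jointly explain a significant share of the variance.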

Discussion
The purpose of the present study was to validate the Digital Competence Scale for Online Migration, a scale to assess digital competence of university students who switched from face-to-face to online classes because of COVID-19 preventive confinement. The dimensions of the scale coincide with four of the five dimensions of the theoretical framework most commonly used to describe digital competencies, the European Union Digital Competence Reference Framework (Carretero et al., 2017): Communication, Content Creation, Problem Solving, and Identity Management. Digital Literacy was not included because experts noted that locating, retrieving, and storing digital information was not a requirement (Carretero et al., 2017); the institution provided all the information and resources related to online class activities (such as readings, e-books, and research papers) through the digital platform. Compared with other instruments, the scale has the advantage of not being specific to particular software or hardware (Calvani et al., 2009; Desjardins et al., 2001; Guzmán-Simón et al., 2017), and its dimensions are based on a theoretical model with defined dimensions instead of general questions (Lopez-Fernandez & Rodriguez-Illera, 2009).
The Identity Management dimension showed the highest reported level of perceived competence. This can be explained by the institutional context, where students are required to register as users with their full name, a process supported by the university system. This result differs from those of other studies, which have reported student difficulties in handling identity online due to a lack of institutional support (Calvani et al., 2012). The other three dimensions showed similar levels around the midpoint of the Digital Competence Scale for Online Migration, which suggests that the digital skills represented by those domains are acceptable. However, some practices can be improved to achieve full mastery, considering that future educational needs will demand them, especially communication skills, since body language loses effectiveness online.
Problem Solving was the dimension with the lowest perceived competence, which can be explained by the fact that students should be able to propose suggestions to their teachers, an ability that is difficult to develop since Mexican teachers are often not receptive to it (Díaz-Guerrero, 2003). Dealing with authority figures in Mexico is a complex process because of the country's collectivist culture, where communication values are not seen as universal; rather, they are practiced and appreciated according to the social group and context (Triandis, 1980). A cultural understanding of the specific elements entailed in handling digital skills can help to develop better online classes and unleash creative activities that could enrich students' education according to their own values and behavior (Carrera et al., 2011). These results are consistent with those reported by Bao (2020), Leví-Orta et al. (2020), and Duarte (2020), who found that many students face technical and operational obstacles that are difficult to resolve due to their lack of experience. They are also consistent with results reported by Esteve-Mon et al. (2020) and Plaza-de-la-Hoz et al. (2015), who stated that communication skills are usually perceived as more difficult than digital literacy skills and easier than problem solving.

Limitations
Limitations include the relatively small sample of students from a university in Mexico. There is also little ethnic and socioeconomic variability. Further, our study did not manipulate independent variables or the presence of the learning activities. Future studies should be conducted in larger, more diverse samples with an experimental design to test causal effects.

Implications for Future Research, Theory, and Practice
This study provides useful information on problems whose solutions can help to improve online educational programs. We have designed a scale that allows institutions to collect data about self-reported perceptions of digital competencies (Fite et al., 2009). The collected data may be useful in planning activities for the reinforcement of digital competencies, as they may identify difficulties and support specific pedagogy for online teaching/learning processes.
The measurement instrument provides information on the perceived difficulty of specific behaviors required of students in online classes, which reflect digital competencies (Fite et al., 2009; Gisbert et al., 2011), and presents evidence of predictive validity for the number of assignments successfully completed. Therefore, the Digital Competence Scale for Online Migration may help faculty identify the degree of difficulty of the various digitalized tasks students must perform in order to take online classes and successfully complete assignments, such as communicating with their teachers or solving problems with platform functioning. The scale can be used to generate information about how students perceive the difficulty of the behaviors required of them in online classes. Information about digital competencies serves to determine readiness to move online or whether students need special support from their institution to successfully take online classes. Areas where skills are lower can be remediated and reinforced to avoid learning barriers (Bao, 2020).