Articles

Vol. 2 No. 1 (2015)

Assessment Tests for Digital Skills: A Tool for Learning Outcomes and University Accreditation

DOI: https://doi.org/10.15377/2409-9848.2015.02.01.1
Submitted: August 13, 2015
Published: November 24, 2021

Abstract

An important question asked by academicians is whether students use Internet languages constructively to further themselves in their chosen field of study while in college, or merely use them for social networking and entertainment.

The iSkills Test is a possible tool for assessing students’ skills in using digital technology on the Internet. This paper investigates the test’s usefulness in helping to answer the question stated above and, in the process, in contributing positively to the reaffirmation of accreditation of the universities where these students are enrolled.

An original method is presented that allows instructors/coaches and institutions to assess the effectiveness of their programs through this test, even though the grading scheme was changed between the initial and final assessment tests within a single term. The grading code of the iCritical Thinking Certification test (now updated as the iSkills test) has been determined, at least for the present.

Two years of testing data at the University of Miami have revealed three major strengths and three major weaknesses among its students. The undergraduates’ strengths are (1) shopping, (2) following directions, and (3) using information ethically. Their weaknesses are (1) selecting resources, (2) researching, and (3) knowing or understanding what they find. These areas of weakness are clear opportunities for the University to improve its students’ digital skills. Useful suggestions have been developed to help students improve their critical thinking abilities and thus perform better on the certification test.

The iSkills tests involved organizing information, developing a search strategy, creating a slide, summarizing researched information, creating a visual representation, and constructing an advanced search. These tasks draw on a variety of skill sets and mimic real-world tasks. The test is recommended for universities to assess their students’ improvement in digital skills on the Internet, across the disciplines. The results can also help convince university accreditation bodies such as the Southern Association of Colleges and Schools in the U.S.A. International acceptance of the test is recommended, since use of the Internet is global and digital skills development is an undeniably important asset.
