LecticaLive for Admissions

Unlimited admissions assessments—on demand

Coming in 2018!

LecticaLive for Admissions examines the fit between your curriculum and the complexity level of students' reasoning skills. It delivers the LRJA (Lectical Reflective Judgment Assessment). The LRJA is a customizable assessment that measures a strong predictor of academic performance—students' ability to use their knowledge to tackle complex real-world issues characterized by multiple variables, diverse perspectives, and competing evidence.

Using LecticaLive for Admissions is easy. We begin with an analysis of the complexity level of your curricular content. Then, a subscription manager uploads applicant information and applicants are invited to take their assessments. Assessments can be taken in your own assessment center or at a contracted external assessment center.

Features

Customized to fit the discipline or domain of an academic program and the complexity level of its curriculum.

Scored with CLAS (the revolutionary Computerized Lectical Assessment System).

Lectical Scores delivered in real time.

Unlimited admissions access.

A simple dashboard showing how well candidates' thinking skills are likely to stand up to the requirements of curricula.

Requirements

For each new curriculum, we must conduct a Curriculum Complexity Analysis.

LecticaLive for Admissions is appropriate for college admissions at any level.

Candidates should have good or very good English skills. 

 

Upgrades

Human review of individual scores

Human review of scores for candidates selected for final consideration (includes an assessment of logical coherence) 

Full assessment report and learning suggestions (recommended for students who are admitted)

Assessment debriefs, group or individual (recommended for students who are admitted)


The LecticaLive for Admissions dashboard

The figure below shows how easy it is to identify the candidates who are most likely to be a good fit—from the perspective of the complexity of their reasoning skills—for a particular curriculum. In the figure, eight candidates are represented by circles of different colors. The size of each circle represents the range in which a candidate's "true score" is likely to fall. The darker numbers in the score scale at the top represent the desirable score range. Individuals whose scores are represented by teal circles are "in the zone": the gap between their scores and the complexity level of the curriculum is "just right." Individuals represented by yellow circles have borderline scores, and individuals represented by red circles are out of range.

When interpreting results, it's important to keep in mind that the scores of college students increase, on average, from 5 to 10 points per year. When you admit a student who is performing 10 points below the range set for your curriculum, you are selecting a student who is one or more years behind requirements and who is likely to be on a flatter developmental trajectory than students with higher scores.

You'll notice that students can be underqualified or overqualified for a particular curriculum. Students who are underqualified are likely to struggle with understanding the content of your curriculum. Students who are overqualified may not feel adequately challenged. In the latter case, we recommend frequent check-ins and readily available opportunities for enrichment.
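
To make this interpretation concrete, here is a minimal, purely illustrative sketch. It is not part of LecticaLive and does not reproduce Lectica's actual dashboard logic; the desirable range, the "borderline" margin, and the candidate scores are invented assumptions, and only the 5–10 points-per-year growth figure comes from the text above.

    # Purely illustrative sketch; thresholds, margin, and scores are invented
    # assumptions, not Lectica's actual dashboard or scoring rules.

    CURRICULUM_RANGE = (1060, 1080)   # assumed desirable Lectical score range
    ANNUAL_GROWTH = (5, 10)           # average college-student growth: 5-10 points/year
    BORDERLINE_MARGIN = 5             # hypothetical margin for "borderline" (yellow)

    def classify(score):
        """Roughly mirror the dashboard's color coding for a single score."""
        low, high = CURRICULUM_RANGE
        if low <= score <= high:
            return "in the zone (teal)"
        if low - BORDERLINE_MARGIN <= score <= high + BORDERLINE_MARGIN:
            return "borderline (yellow)"
        return "out of range (red)"

    def years_behind(score):
        """Translate a below-range score into years of typical growth."""
        gap = max(0, CURRICULUM_RANGE[0] - score)
        return gap / ANNUAL_GROWTH[1], gap / ANNUAL_GROWTH[0]  # (best case, worst case)

    for candidate in (1048, 1057, 1070, 1083, 1095):
        fastest, slowest = years_behind(candidate)
        print(f"{candidate}: {classify(candidate)}; "
              f"~{fastest:.1f}-{slowest:.1f} years behind the range floor")

In practice, the desirable range is set by the Curriculum Complexity Analysis described below; the numbers here are placeholders only.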


Curriculum Complexity Analysis

The Curriculum Complexity Analysis examines the average complexity demands of curricula in a particular domain/discipline at a particular educational level. The table below shows examples of ranges for year 1 programs in different disciplines. Please note: The ranges shown are illustrative only and are not representative of actual programs in these disciplines.

Discipline      Degree      Year 1 Lectical range
Philosophy      BA          1080–1100
                MA          1120–1140
                PhD         1160–1180
Law             JD          1090–1110
                LLM         1130–1150
                SJD         1170–1190
Neuroscience    BA          1060–1080
                MA          1110–1120
                PhD         1140–1160
Education       BA/BS       1055–1075
                MA/MS       1095–1115
                EdD/PhD     1135–1155

A note on cheating

Lectical Assessments are taken online in secure sessions. If you conduct the assessments under observation, it is virtually impossible for test takers to cheat. Even when assessments are not observed, cheating is difficult for a number of reasons:

  • assessment questions call for written judgments and explanations;
  • test takers cannot learn in advance which test form they will be presented with; and
  • attempts to cheat most often result in irregularities that cause CLAS to deliver a particular kind of "low confidence" score.

If you have concerns about cheating, we recommend that you consider retesting final candidates under observation.

Cheating can be eliminated entirely only through frequent, low-stakes assessment that tracks growth over long periods. Not only is cheating reduced when the stakes are lower, but incidents of cheating are likely to stand out from a student's overall developmental trajectory.


Contact us for an admissions estimate...

 

Selected funders

IES (US Department of Education)

The Spencer Foundation

NIH

Dr. Sharon Solloway

The Simpson Foundation

The Leopold Foundation

Donor list

Selected clients

Glastonbury School District, CT

The Ross School

Rainbow Community School

The Study School

Long Trail School

The US Naval Academy

The City of Edmonton, Alberta

The US Federal Government

Advisory Board

Kurt Fischer, Ph.D., Harvard Graduate School of Education, Emeritus

Antonio Battro, MD, Ph.D., One Laptop Per Child

Marc Schwartz, Ph.D., former high school teacher, University of Texas at Arlington

Mary Helen Immordino-Yang, Ed.D., University of Southern California

Willis Overton, Ph.D., Temple University, Emeritus