Work History

Sep 2010 - Present

Senior Director, Assessment Services

GED Testing Service

Responsible for directing the design and development of the next-generation GED assessment in collaboration with external partner organizations. The next-generation test is the core product of the GED Testing Service, on which all other programs and services are based. It is also the cornerstone of the organization's 21st Century Initiative, a program that will guide the direction of the GED Testing Service for the next decade.

1. Serve as the primary GEDTS contact on all partner issues related to assessment development.
2. Manage the relationship between the GEDTS, internal assessment development and psychometric staff members, and the GED assessment partners' staff members.
3. Collaborate with and provide direction to the assessment partners in designing and developing the GED assessment, including determination of use of test scores, determination of content coverage and assessment model, design of test blueprints, design of innovative item types, and development of items for administration in a computer-based testing environment.

Jan 2004 - Sep 2010

Manager, Psychometric and Research Services

Pearson

Coach and supervise a team of fifteen research scientists and research associates. Counsel internal and external customers in areas such as validity theory, reliability, item response theory, standard setting, test equating, test construction, item analysis, research design, staffing/resource planning, staff development, quality control procedures, psychometric requirement specifications, and master scheduling. Serve as expert on psychometric and measurement matters. Assess customer needs and present technical solutions in response to requests for proposals (RFPs). Liaise between PRS (Psychometric and Research Services) and other departments, including Content Support Services, Program Management, Scoring and Reporting, Information Technology, and Assessment and Information Quality. Organize, lead, and facilitate data review meetings. Aid customers with development of documentation for NCLB (No Child Left Behind) requirements. Contribute to development of customer training videos.

Managed and performed analyses associated with technical and measurement requirements for six assessment programs, including: Hawaii State Assessment, Hawaii State Alternate Assessment, Arkansas Augmented Benchmark Exam, Washington Language Proficiency Test, Puerto Rico Pruebas Puertorriqueñas de Aprovechamiento Académico, and Arizona's Instrument to Measure Standards.

Designed and facilitated 34 standard-setting workshops spanning 15 states and supporting 2 catalog products.

Boosted operational, psychometric, and management capacities by identifying and supporting corporate-wide continuous process improvement initiatives.

Improved quality by creating quality control checklist to facilitate workflow monitoring for all major psychometric deliverables, including item analysis, equating, standard setting, data review, technical reporting, and test construction. Also, devised requirements specifications for psychometric analyses that improved quality of outputs.

Devised communication protocol between and within departments, which involved clearly defining roles and responsibilities of each functional group for various processes.

Increased developers' productivity by supervising and leading the development of a new test construction/test map tool, which automatically produced statistical reports of test construction results along with an output file of critical results.

Earned 'Revere' award for oversight of psychometric work on Canada's National Test in 2006.

May 2000 - Dec 2003

Psychometrician

Harcourt Assessment

Planned, managed, and conducted psychometric analyses. Drafted reports. Coordinated and facilitated data review and standard-setting meetings. Consulted with customers on psychometric and measurement issues. Developed and conducted internal training sessions regarding assessment and measurement theory, psychometric analysis, equating, and score interpretation. Led pre- and post-test workshops and educational conferences.

Selected by Vice President to serve as psychometric spokesperson in a data review video produced for the Texas Education Agency.

Delivered consultation services and managed activities for large-scale assessment programs, including Mississippi Subject Area Testing Program, Hawaii State Assessment, Nevada CRT and High School Proficiency Exam, and Virginia Standards of Learning.

Oversaw psychometric activities and supported many other assessment programs, including programs for California, South Dakota, Oklahoma, New Mexico, Michigan, Florida, and Canada.

Supported catalog programs for The New Standards Reference Exam and Metropolitan Achievement Test, 8th edition.

Education

1996 - 2000

Ph.D.

University of Pittsburgh
1996 - 1998

M.A.

University of Pittsburgh
1994 - 1996

B.A.

The University of Texas at San Antonio

Summary

* Expertise in design and development of various types of large-scale educational assessments, including summative, formative, ELL (English Language Learner), and alternate assessments.
* Adept at leading and contributing to all aspects of the assessment process, including standards development, alignment, pilot testing, field testing, test construction, item analysis, calibration, equating, scaling, standard setting, and score reporting.
* Extensive knowledge of the current K-12 assessment market, including state-, district-, and federal-level requirements and standards.
* Led numerous optimization initiatives, including Lean Manufacturing and Six Sigma-based programs.
* Outstanding communication and leadership skills; history of creating collaborative and meaningful relationships between departments and delivering critical projects on time.

Interests

Assessment Design, Test Construction, Equating, Item Analysis & Reporting, Standard Setting, Item Response Theory, Process Improvement, Risk Management, Project Leadership, Regulatory Compliance, Team Training & Supervision, Relationship Building, Communication, Presentation Skills, Performance Management

Papers and Publications

(Published under the name T. L. Cerrillo early in career.)

Davis, J., & Gardner, T. L. (2004, April). A comparison of paper-based and computer-based administrations of a high-stakes, high school graduation test. Paper presented at the annual meeting of the American Educational Research Association, San Diego, CA.

Cerrillo (Gardner), T. L., & Lane, S. (2003, April). Factors influencing DIF on a large-scale mathematics assessment. Paper presented at the annual meeting of the National Council on Measurement in Education, Chicago, IL.

Cerrillo, T. L., Hansen, M. A., Parke, C. S., Lane, S., & Scott, K. (2000, April). The relationship between MSPAP and science classroom instruction and assessment materials. Paper presented at the annual meeting of the National Council on Measurement in Education, New Orleans, LA.

Hansen, M. A., Cerrillo, T. L., Lane, S., Paluda, J., Parke, C. S., & Van den Heuvel, J. R. (2000, April). The relationship between MSPAP and social studies instruction and assessment materials. Paper presented at the annual meeting of the National Council on Measurement in Education, New Orleans, LA.

Harwell, M., Cerrillo, T. L., & Hansen, M. (2000, April). The effects of different methods of handling missing data on models of student persistence. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.

Gatti, G., Al-Subahi, A., Cerrillo, T. L., Baughman, M., & Harwell, M. (1999, April). In harm's way: Parametric tests and assumption violations. Paper presented at the annual meeting of the American Educational Research Association, Montreal, Canada.

Lane, S., Cerrillo, T. L., Ventrice, J. A., Parke, C. S., & Stone, C. A. (1999, April). Impact of the Maryland School Performance Assessment Program (MSPAP): Evidence from the principal, teacher and student questionnaires (reading, writing, and science). Paper presented at the annual meeting of the National Council on Measurement in Education, Montreal, Canada.

Parke, C. S., Cerrillo, T. L., Levenson, J., O'Mara, J., Hansen, M. A., & Lane, S. (1999, April). Impact of the Maryland School Performance Assessment Program (MSPAP): Evidence from classroom instruction and assessment activities (reading and writing). Paper presented at the annual meeting of the National Council on Measurement in Education, Montreal, Canada.

Presentations

Gardner, T. L. (2010, April). Validating educational assessments. Discussant commentary presented at the annual meeting of the American Educational Research Association, Denver, CO.

Cerrillo, T. L. (2003, June). Establishing academic achievement levels on the Hawaii State Assessment: Integrating new grades into an existing program. Presented at the annual meeting of the Large Scale Assessment Conference, San Antonio, TX.

Cerrillo, T. L. (2002, February). Establishing proficiency levels on the HCPS II. Paper presented at the annual meeting of the Hawaii Educational Research Association, Honolulu, HI.

Cerrillo, T. L. (2001, April). Careers in educational measurement: Advice from the professionals. Presented at the annual meeting of the National Council on Measurement in Education, Seattle, WA.

Cerrillo, T. L. (2001, March). Standard setting on the HCPS II Statewide Assessment. Presented at the Hawaii Assessment Conference, Honolulu, HI.

Cerrillo, T. L. (2001, February). Setting performance standards in Hawaii. Presented at the annual meeting of the Hawaii Educational Research Association, Honolulu, HI.

Computer Skills

Winsteps, Parscale, SPSS, SAS, MS Word, MS Excel, MS PowerPoint