This paper explores a method to support instructors in assessing cognitive skills in their courses, designed to enable aggregation of assessment data across an institution. A rubric authoring tool, "BASICS" (Building Assessment Scaffolds for Intellectual Cognitive Skills), was built as part of the Queen's University Learning Outcomes Assessment (LOA) Project. It provides a workflow for making assessment choices and generates an assessment rubric that can be tailored to individual needs based on user input. The dimensions and criteria in BASICS were adapted from the Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics and drew on annotations from over 900 work samples from the LOA project. This paper summarizes the development of the tool and presents initial reliability and validity data from a pilot study. The pilot compared assessment data from course Teaching Assistants with that from trained Research Assistants, and found that the BASICS-developed rubric was consistent for the assessment of critical thinking and problem solving. Analysis found moderate intraclass correlation coefficients between the BASICS rubric and corresponding VALUE rubric dimensions, suggesting that the BASICS rubric aligned with the VALUE criteria. Preliminary findings suggest that BASICS is an effective tool for instructors to author rubrics tailored to their own specifications for the assessment of cognitive skills in a course. It also shows promise as a method for aggregating data across an institution. Researchers are conducting further investigation to evaluate the reliability of BASICS rubrics over multiple work samples from a range of disciplinary contexts.