Data_Research Issues (SAI)
The data team leads had a very productive Elluminate session on 10/21. Thank you all for joining in. Our focus was the SAI database, with Ann Paulson as guest presenter sharing her knowledge of it. The benefit of the database for RPM is that it offers a relatively standardized set of data across the RPM local projects. It will not only help us collect and report data to our funders but can also provide each project site with data to measure improvement as the site moves toward its individual project's aims.
We started the session with a specific charge:
What sort of base measure could we collectively use to show improvement and substantial progress in our math departments?
By the end of the session we landed on the following measures, based on the call's notes; feel free to clarify or correct if that's not your understanding:
#1: What percentage of students earn a pre-college math point in the year they attempt pre-college math? (not limited to a fall cohort)
#2: For students who start in the fall and begin math at levels 1-3 that first year, how many make a substantive gain (two or more points) by the end of the year?
#3: For students who start in the fall and begin math at level 4 that first year, how many earn their quant point by the end of the year? (The benefit of using fall is that you have one complete year.)
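To make the three measures concrete, here is a minimal Python sketch. The record fields (`start_quarter`, `start_level`, `points_earned`, and so on) are illustrative stand-ins, not the actual SAI schema.

```python
from dataclasses import dataclass

@dataclass
class Student:
    start_quarter: str        # quarter of first enrollment, e.g. "fall"
    start_level: int          # pre-college math level at entry (1-4)
    points_earned: int        # pre-college math points earned in the first year
    attempted_math: bool      # attempted pre-college math in the first year
    earned_quant_point: bool  # earned the college-level quantitative point

def measure_1(students):
    """Share of students earning a pre-college math point in the year
    they attempt pre-college math (any cohort, not just fall)."""
    attempters = [s for s in students if s.attempted_math]
    return sum(s.points_earned >= 1 for s in attempters) / len(attempters)

def measure_2(students):
    """Of fall starters beginning at levels 1-3, the share making a
    substantive gain (two or more points) by the end of the year."""
    cohort = [s for s in students
              if s.start_quarter == "fall" and s.start_level in (1, 2, 3)]
    return sum(s.points_earned >= 2 for s in cohort) / len(cohort)

def measure_3(students):
    """Of fall starters beginning at level 4, the share earning their
    quant point by the end of the year."""
    cohort = [s for s in students
              if s.start_quarter == "fall" and s.start_level == 4]
    return sum(s.earned_quant_point for s in cohort) / len(cohort)
```

Each function returns a fraction (0.0 to 1.0) over its own cohort, so the denominators differ across the three measures, matching the cohort definitions above.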
It also sounded as if we decided that splitting this out demographically would reduce the 'n' too much, so Ann agreed to work with the projects on this by request.
Some other topics raised and/or issues to further explore:
Mickey raised the idea of value-added measures. RPM could get Accuplacer/COMPASS scores (converted to standard scores), which may help identify which students are showing the most progress. However, there is potentially a lot of variation in placement from school to school. IR representatives can help RPM get placement score data.
Some in the session also discussed how demographic variables can be proxies for math preparation. There's a zip code variable in SAI, for example, which may give a good indication of where students are coming from; race and other demographic variables may serve similarly. Are they stable over the years? Where students enter the curriculum may also indicate a broader trend.
Issues around number of attempts: Carmen's data looks at 'number of attempts' (included below). Carmen shows students who got points and how many times it took them to get that point.
Address the issue of underestimating the problem: merge Carmen's worksheets by level, showing who earned a point and who didn't. There is an attempts variable; we need to know about those students who don't get points.
Taking specific population data into account: For example, under 45 credit programs and certificates may not have a quantitative / math course requirement.
This may be worth taking into account when running your college's data.
On our behalf, Ann has been discussing and looking at substantial progress, but is there something in between as well? A common way of measuring progress is passing a course on the first attempt (1:1), but could we also consider, for example, the percent of students who start at the lowest level and make gains at 1:2?
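If '1:2' is read as passing within two attempts (an interpretation on my part; the notes don't define it), the two rates could be computed side by side like this:

```python
def pass_rates_by_attempt(records, max_attempts=2):
    """records: (passed, attempts) pairs, one per student.
    Returns the first-attempt pass rate (the '1:1' rate) and the rate
    of passing within `max_attempts` attempts (here read as '1:2')."""
    n = len(records)
    first = sum(1 for passed, attempts in records if passed and attempts == 1)
    within = sum(1 for passed, attempts in records
                 if passed and attempts <= max_attempts)
    return first / n, within / n
```

The gap between the two numbers is exactly the group this question is after: students who get there, but not on the first try.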
Jeff Lucas shared an interest in measuring student withdrawals and others agreed this was a good idea. We can explore this more with Jeff and share his team's work on this.
RPM SAI Data Sept 2010.docx
As the data group, I'd like to generate some discussion around your insights and growing understanding of the SAI database: what additional ideas does your team have about how this data can be useful to your project? Are there data you would like disaggregated? Are there target groups within the data, such as level 1 students or a particular ethnic group, you'd like to know more about? How has IR used and shared this data on your campus? What other questions do you have about the data?
One idea that came up frequently at the Institute is the notion of identifying and tracking a gross measure across the teams, such as 'completion', which Ann feels would be easy to do. For example, we could look at and compare 09-10, 10-11, and 11-12 completion data and hopefully see an upward trend. The SAI completion data has generally been steady from year to year (69%, 70%, etc.), so change in completion from year to year within the RPM project-site colleges could be useful information for one aspect of our broader evaluation efforts.
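A small sketch of the year-over-year comparison described above; the rates used here are illustrative only (the notes cite system completion near 69-70%), not actual SAI figures:

```python
def completion_trend(rates):
    """Year-over-year change in completion rate, in percentage points.
    rates: ordered (year_label, completion_rate) pairs."""
    return [(rates[i][0], round((rates[i][1] - rates[i - 1][1]) * 100, 1))
            for i in range(1, len(rates))]

# Illustrative rates only:
completion_trend([("09-10", 0.69), ("10-11", 0.70), ("11-12", 0.72)])
```

Reporting the change in percentage points rather than raw rates makes the hoped-for upward trend easy to see at a glance across the three grant years.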
You all also asked about the gross measure running across all RPM projects that was included in the SBCTC's proposal to the Gates Foundation. Here's what I found in the proposal:
To increase the overall pre-college math achievement gain at participating colleges by 15% and the substantial gain rate by 10% over the 3-year period of the grant.
In 2007-08, across the system 21,640 transfer-intent students (70% of those attempting) made a pre-college math achievement gain; 12,366 (44%) made a substantial gain (2 levels or college math).
Year 2 Anticipated Progress
At the participating colleges, increase overall pre-college math achievement gain by 10% (382 additional students) and substantial gain by 5% (109 students).
Year 3 Anticipated Progress
At the participating colleges, increase overall pre-college math achievement gain by 15% (630 additional students) and substantial gain by 10% (229 students).
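The "additional students" figures in these targets are just a percentage applied to a baseline count. As an illustration using the system-wide 2007-08 figures quoted above (note the proposal's per-college targets of 382 and 630 imply smaller, college-level baselines that aren't stated here):

```python
def additional_students(baseline_count, pct_increase):
    """Additional students implied by a percentage increase over a
    baseline count, rounded to the nearest whole student."""
    return round(baseline_count * pct_increase)

# System-wide 2007-08 baselines quoted above:
achievement_gain = 21_640   # students making a pre-college achievement gain
substantial_gain = 12_366   # students making a substantial gain (44%)

additional_students(achievement_gain, 0.15)   # 15% rise -> 3246 more students
additional_students(substantial_gain, 0.10)   # 10% rise -> 1237 more students
```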
Comment from Helen Burn posted Sept 24, 2010:
I'm having a hard time recreating the numbers for "Year 2 Anticipated Progress." Specifically, the attachment from Ann Paulson includes 2007-08 precollege math attempts (Study Question 2A, p. 6), but I don't see a table showing the percentage of 2007-08 students who attained at least one precollege math point. Study Questions 3A and 3B (pages 15 and 16) show attainment for 2006-07 and 2008-09 but not 2007-08. Since the percentage of students attaining precollege math points was stable in the two years shown (roughly 70%), my calculation below shows that a 10% achievement gain is closer to 515 students.
Number of students who attempted precollege math (from Study Question 2A, p. 6):
Spokane Falls: 2377 (this number seems big to me)
Lower Columbia: 601
North Seattle: 548
NWIC: not available in SAI
70% of total is 5151, so 10% would be about 515 students.
I think it's best to look at the colleges separately as well as in aggregate.
Example of Local Transcript Study Focused on Math Transitions (High School to College)
This study (see PowerPoint slides) was conducted by the WSU Social and Economics Research Center for the North King County Transition Math Project, aka Shoreline CC and Shoreline SD.
Sept 06 Shoreline Phase 2 V3.ppt