Wednesday, September 28, 2016

Differentiating Between Research and Common Sense

If you’ve been involved in education policy and/or funding, you have no doubt heard the arguments about whether or not there is a connection between education funding and student outcomes. Education proponents cite data showing that states paying more per pupil on education tend to have higher student graduation rates and better student assessment results. People on the other side of the issue lambast the research (and often the people citing the research), indicating that it is all smoke and mirrors. Their bottom line is that there is no connection between education funding and student outcomes.

I am a researcher by title and by nature. I like to investigate the available data and see what conclusions I can come to. However, as I’ve noted before, I understand that research cannot give us absolute truths or undeniable facts. Research provides indicators. Indicators show us patterns and suggest what might be happening based on what we can observe and measure. Research cannot prove that more money equals better outcomes. Statistics can show that the two are correlated, and even that more money predicts better outcomes, but because we cannot control for the variety of factors that impact educational outcomes, it is very, very difficult to produce definitive proof of a causal relationship between education funding and student success.

But let’s take a moment to really think about the debate. Let’s step back from the data and the arguments about particulars, and let’s talk big picture.  

If you have more resources, does it not follow that you have a better chance of being successful in whatever you are doing than if you had fewer resources? Does anyone truly believe that putting more money into the education system would have no effect, or even negative effects, on student graduation rates and assessment scores? Why would anyone believe such a thing?

Of course it is not that simple. The education funding skeptics (those who say funding amounts do not matter) point to efficiency and effectiveness, saying that schools and districts are mismanaging the money, that administrators are being paid too much, and that we are not getting the return on our educational investment that we should. They say the solution is not to spend more, but to crack down on school administrators and ensure that they spend the money they have more carefully.

These arguments are made in the face of reports from advocates for additional funding, which consistently show that Kansas ranks high in education outcomes but only in the middle for education spending, suggesting that the state is more successful in terms of student outcomes than would be expected based on our funding level. So while we can assert the correlation between spending and outcomes, we can also assert that Kansas does better than can be attributed to spending alone. Something about the Kansas education system allows us to have better outcomes for our students than other states that spend similar amounts.

But even these arguments seem to create a false dichotomy. No education proponents ever argue that we should spend more and not also ensure we are spending as efficiently and effectively as possible. Of course we need to spend carefully and to look for opportunities to save and make those dollars stretch. But anyone on a tight budget can tell you that scrimping and saving only gets you so far. You have to have a minimum amount to work with in order to be effective, and if you don’t have that minimum amount, no scrimping and saving will help.

So back to my original point. Research, statistics, and analysis will never be able to prove, beyond the shadow of a doubt, that increasing education spending will lead to better student outcomes. But based on your own personal experiences, and your own common sense, do you truly need research to prove that to you?  Do you need someone to present incontrovertible evidence that having more resources increases your chance for success?   

Maybe it is time we focused a bit more on common sense and a bit less on what research can or cannot prove.  

Wednesday, September 14, 2016

Key Findings from the KASB Superintendents Survey Report

KASB recently released the 2016-17 KASB Superintendent Survey Report, which provides information reported by Kansas public school districts to the Kansas Association of School Boards related to the demographics, contracts, and salary and benefits of their superintendents from the 1995-96 school year through the 2015-16 school year. Over 90 percent of districts responded each year, with responses for many years nearing 100 percent.

The KASB Superintendent Survey changed format beginning with the 2016-17 school year. The report presents the data in the new format, and utilizes data formerly collected via the Administrators Survey.  
Below are some of the key findings from the report.  From 1995-96 (or earliest available year) to 2015-16:

  • The percent of districts with female superintendents has more than tripled (from 5% to 16%).
  • The average age for superintendents has consistently been over 50, but has shown a slight decrease since 2013-14.
  • The percent of districts reporting they had an interim superintendent went from 3.5% in 2010-11 to 2.9% in 2015-16.
  • The percent of districts reporting they had a superintendent who was KPERS retired went from 9.1% in 2011-12 to 11.4% in 2015-16.
  • The average number of years as a superintendent in any district has remained between 8 and 10 years, and the average number of years as a superintendent in the current district has remained between 4 and 6 years.
  • The average for total years in superintendents’ current contracts has remained very consistent at 2 years.
  • The average number of days in superintendents’ contracts has increased from approximately 255 days in 2006-07 to 272 in 2015-16.
  • Average superintendent salaries have increased from approximately $67,000 in 1995-96 to $103,000 in 2015-16, with an average annual increase of 2.2% (see the sketch after this list).
  • Average board paid fringe benefits for superintendents have increased from $3,200 to $10,600, with an average annual increase of 6.4%.
  • Average salaries for female superintendents have been consistently higher than average salaries for male superintendents.
  • Average salary plus benefits for the smallest districts increased from $60,000 to $97,000, with an average annual increase of 2.3%.
  • Average salary plus benefits for the largest districts increased from $110,000 to $188,000, with an average annual increase of 2.8%.
  • Per pupil spending on superintendent salaries went from $138 per pupil in 1995-96 to $220 in 2015-16 overall, with smaller districts seeing larger per pupil amounts and smaller average annual increases than larger districts.
  • The percent of districts indicating their superintendent had a Bachelor’s degree increased from 89.2% in 2007-08 to 96.7% in 2015-16.
  • The percent of districts indicating their superintendent had a Master’s degree increased from 87.5% in 1999-00 to 95.9% in 2015-16.
  • The percent of districts indicating their superintendent had a Doctorate or Specialist degree decreased from 29.9% to 19.2% and from 30.3% to 17.1%, respectively.
  • Across degrees, the percent of superintendents coming from Kansas postsecondary institutions has increased from an average around 70% in 1999-00 to around 85% in 2015-16.
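
The “average annual increase” figures above are compound annual growth rates. Below is a minimal sketch of that arithmetic (an illustration only, not KASB’s actual calculation), assuming the salary series spans the full 20 years from 1995-96 to 2015-16; the function name is ours, purely for illustration.

```python
# A minimal sketch (not KASB's calculation) of how an "average annual increase"
# can be computed as a compound annual growth rate from two endpoint values.

def average_annual_increase(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two values over a number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Approximate average superintendent salary, 1995-96 ($67,000) to 2015-16 ($103,000),
# treated as a 20-year span; prints roughly 2.2%, matching the report's figure.
print(f"{average_annual_increase(67_000, 103_000, 20):.1%}")
```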

In addition to the report, KASB released the following data resources, which allow users to drill down and filter by district, KASB Region, KSHSAA Class, KNEA Uniserv, High School League, and other factors:
  • Two Excel workbooks containing data for all years, the first containing information on superintendent demographics, contracts, salary, and benefits, and the other containing information on degrees and postsecondary institutions.
  • Two Interactive Tableau Tools with all available data, the first containing information on superintendent demographics, contracts, salary, and benefits, and the other containing information on degrees and postsecondary institutions.

These resources are made available to members only, and can be found at https://kasbresearch.org/member-data/.  If you need the password to access the members-only data page, or have questions on the report or the data, contact tcarter@kasb.org.    

Friday, September 9, 2016

Key Findings from the KASB Fees Survey Report

KASB recently released the 2016-17 KASB Fees Survey Report, which provides information reported by Kansas public school districts to the Kansas Association of School Boards related to the fees charged for activities, facilities usage, and other items and services from the 1995-96 school year through the 2015-16 school year. Over 90 percent of districts responded each year, with responses for many years nearing 100 percent.

The KASB Fees Survey changed format beginning with the 2016-17 school year. The report presents the data in the new format, and utilizes data formerly collected via the Calendar and Fees Surveys.  

Below are some of the key findings from the report.

  • From 2001-02 to 2015-16:
    • The percent of districts charging students for participation in extracurricular activities doubled from 15 percent to 30 percent.
    • The percent of districts reporting policies for reducing or waiving participation fees has increased from approximately 5 percent to 15 percent.
    • The average amount charged per event at the middle and high school levels has remained fairly consistent.
    • The percent of districts offering camps, meals, and uniforms for free has decreased; the percent providing laundry for free has remained fairly consistent; and the percent providing transportation for free has increased.
  • From 2002-03 to 2015-16:
    • Districts charging admission to athletic events increased from 73 percent to 83 percent.
    • The percent of districts reporting activities income from gate receipts increased from 83 percent to 94 percent; the percent reporting income from general fund transfers decreased from 73 percent to 61 percent; the percent reporting income from vending machines decreased from 28 percent to 18 percent; and the percent reporting income from concessions increased from 12 percent to 21 percent.
    • Per pupil income from bookstores decreased from $5 to $2, from vending machines decreased from $9 to $4, and from concessions increased from $17 to $20.
    • All fees show increases over time. This includes daycare, driver’s education, musical instruments, library and technology fees, per hour and per event facilities rental, shops and labs, textbooks and publications, uniforms, all-day kindergarten, and other miscellaneous items.
  • From 2007-08 to 2015-16:
    • Per student participation fees increased from $22 for middle school and $25 for high school to $28 and $34 respectively.
    • High school annual admission passes have gone from approximately $19 to $30.
    • Middle school annual admission passes have gone from $18 to $26.

In addition to the report, KASB released accompanying data resources, which allow users to drill down and filter by district, KASB Region, KSHSAA Class, KNEA Uniserv, High School League, and other factors.


Questions on the report and data can be sent to tcarter@kasb.org.    

Thursday, September 8, 2016

KASB's Report Card 2016 Methodology Notes

We have had some folks asking us about the methods and data used for the 2016 KASB State Education Report Card, so I wanted to provide some additional information on why we used the approach we did.


The Report Card represents an update to both the Kansas Educational Achievement Report Card 2015 and the Comparing Kansas series of reports produced from August 2015 through January 2016 (all the reports can be found at kasbresearch.org/publications). Along with providing the most recent years available for statistics used in earlier reports, the 2016 Report Card utilized additional measures and used a different method for determining Aspiration States.

In the earlier work, we used fourteen measures of student achievement and attainment:
  • Student Attainment
    • Freshman Graduation Rate
    • Cohort Graduation Rate
      • All Students
      • Economically Disadvantaged Students
      • Limited English Proficiency Students
      • Students with Disabilities
    • Percent of 18-25 year olds with a high school diploma
  • Student Achievement
    • Percent performing at or above “Basic” on the NAEP assessment
      • All Students
      • Students Eligible for the National School Lunch Program
      • Students Not Eligible for the National School Lunch Program
    • Percent performing at or above “Proficient” on the NAEP assessment
      • All Students
      • Students Eligible for the National School Lunch Program
      • Students Not Eligible for the National School Lunch Program
    • Percent meeting all four ACT benchmarks (adjusted for percent participation)
    • Average composite SAT score (adjusted for percent participation)

We used as many variables as we could because we knew a multiple measure approach (similar to a portfolio assessment approach) would yield better information about student outcomes than any one measure (subject to potential bias and measurement error) could. We included measures of both student achievement and attainment because we felt it was important to define student success not only in terms of test scores, but also in terms of how many students actually graduate from high school.

In addition, we included the overall statistics as well as any available subpopulation statistics because we felt it was important to look at how the students in the state were doing as a whole and how specific subgroups were doing. Some might say this means we “double-counted students”; however, it is important to clarify that this is state-level aggregate data analysis, which is vastly different from student-level data analysis, where including the same student’s results in multiple groups would be a concern.

In terms of the assessments used, we chose NAEP, ACT, and SAT because they are the only three K-12 measures collected and reported for every state. The NAEP has limitations related to the size of the sample used, and recent research has called into question how comparable its results are across states, but it is nonetheless the only measure available that provides comparisons in reading and math at the fourth and eighth grades.

Both ACT and SAT results were included because each state typically has a much higher participation rate on one exam than on the other; however, all states have results for both exams. Research shows a state’s participation rate has a huge impact on its overall ACT and SAT results, with higher participation predicting lower average results statewide. Because of this, we devised a method of ranking states’ ACT and SAT outcomes by the amount by which each state deviated (above or below) from the outcome predicted by its participation rate, using the same linear regression model described in that research.
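
To make that adjustment concrete, here is a minimal sketch of the residual-based ranking described above. It is not KASB’s code, and the state names, participation rates, and scores are made-up placeholders; the only point is the mechanics of ranking states by how far they sit above or below the regression line.

```python
# Sketch of a participation-adjusted ranking: fit a simple linear regression of
# average score on participation rate, then rank states by their residuals
# (how far each state's actual result sits above or below the prediction).
# All values below are hypothetical placeholders, not real state data.

import numpy as np

states = ["State A", "State B", "State C", "State D"]
participation = np.array([0.75, 0.30, 0.95, 0.55])  # share of graduates tested
avg_score = np.array([21.1, 24.3, 20.2, 22.5])      # e.g., average ACT composite

# Higher participation tends to predict lower average scores (negative slope).
slope, intercept = np.polyfit(participation, avg_score, 1)
predicted = intercept + slope * participation

# Positive residual: the state outperforms what its participation rate predicts.
residuals = avg_score - predicted
adjusted_ranking = [states[i] for i in np.argsort(-residuals)]
print(adjusted_ranking)
```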

Taking all of these measures together, we looked for states that had better outcomes than Kansas on at least 8 of the 14 measures. Initially, in August 2015, we found only five states met this criterion (New Hampshire, New Jersey, Massachusetts, Vermont, and Minnesota), meaning Kansas ranked sixth in the nation on the combination of these measures. We ran the analysis again with updated statistics in January 2016 and found that seven states outperformed us based on the same criterion (Indiana, Iowa, Massachusetts, Nebraska, New Hampshire, New Jersey, and Vermont), meaning Kansas had moved from sixth to eighth. We did it again in May 2016 and found the same seven states outperforming Kansas.
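
As a rough sketch of that screening step (again illustrative, not KASB’s actual code), the logic amounts to counting, for each state, how many of the 14 measures it ranks ahead of Kansas on; the example data below are invented.

```python
# Sketch of the "better than Kansas on at least 8 of 14 measures" screen.
# ranks_by_state maps each state to its rank on every measure (1 = best);
# a state qualifies if it ranks ahead of Kansas on at least `threshold` measures.

def states_ahead_of_kansas(ranks_by_state, kansas="Kansas", threshold=8):
    kansas_ranks = ranks_by_state[kansas]
    ahead = []
    for state, ranks in ranks_by_state.items():
        if state == kansas:
            continue
        wins = sum(1 for r, k in zip(ranks, kansas_ranks) if r < k)
        if wins >= threshold:
            ahead.append(state)
    return ahead

# Hypothetical example with only 3 measures (threshold lowered to 2 accordingly).
example = {"Kansas": [6, 10, 8], "State A": [2, 3, 15], "State B": [20, 15, 3]}
print(states_ahead_of_kansas(example, threshold=2))  # ['State A']
```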

By August 2016, new data was available for many of the statistics we were using, but we also decided it was time to evaluate those statistics and decide if they were the right ones to be using. We decided to drop the Freshman Graduation Rate because it was no longer being reported by states. In addition, based on the goals outlined in the Rose Capacities and KSDE’s Kansans CAN initiative, we felt we needed to include outcomes related to postsecondary success. We chose the other two measures of educational attainment for 18 to 24-year-olds available from the U.S. Census Bureau: the percent with some college or higher and the percent with a four-year degree or higher.

The postsecondary measures chosen are not perfect, by any means. We would like to be able to report on the number of Kansas high school graduates enrolling in postsecondary institutions in any state, but this data is simply not available. We would also like to discuss postsecondary remediation rates, but these are likewise not available in a format comparable from state to state. So we are using the 18 to 24-year-old measures knowing there are issues: they include people who moved to Kansas after obtaining high school diplomas in other states, exclude Kansas high school graduates who enroll in out-of-state institutions, and so forth.

With the elimination of one measure and the addition of two, we were now up to 15 student outcome measures. These measures were organized into three categories: postsecondary, graduation, and assessments. Rather than looking for the number of states that outperformed Kansas on a majority of measures, our Associate Executive Director of Advocacy Mark Tallman wanted to be able to produce an overall rank so we could also see to what extent each of those states outperformed Kansas.

This is where the researcher and statistician in me had a bit of difficulty, as calculating an average of ranks and then ranking that average felt a bit like writing a book report on a bunch of other book reports tied together. But then I reminded myself of a key mantra I have been repeating to myself since college - research provides indicators, not facts. We acknowledge that these state-level statistics represent aggregate measures that can mask a lot of the things going on at the lower levels, such as at the county, district, school, classroom, and especially student levels. They are not perfect measures and they are not designed to produce perfect conclusions. We are simply trying to get an idea of where Kansas stands based on a bunch of aggregate measures considered together.

We averaged the ranks and then calculated a rank based on those averages. Kansas landed at number 10, with the following states in the top nine positions:
  1. New Hampshire
  2. Massachusetts
  3. Vermont
  4. New Jersey
  5. Nebraska
  6. Iowa
  7. Minnesota
  8. Indiana
  9. North Dakota

However, after additional discussion, we decided we should be using a weighted average, largely because of the statistics where subpopulations and multiple measures were used, such as the NAEP statistics and the Cohort Graduation Rate measures. Because we had six statistics for NAEP, NAEP overall would have had six times the impact of the ACT results, for example. In addition, we wanted each of the three types of indicators (postsecondary, graduation, and assessments) to get equal weighting.

In the end, we came up with a ranking that weighted the measures so that the postsecondary measures, the graduation measures, and the assessment measures each make up one-third of the total. In addition, within the assessment category, the six NAEP measures taken together have the same weight as the ACT measure and as the SAT measure.
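
A minimal sketch of that weighting scheme follows. The one-third split across the three categories and the treatment of NAEP as a single group come straight from the description above; the equal splits within each group, the measure counts (three postsecondary, four graduation, and eight assessment measures), and all names are our assumptions for illustration, not KASB’s published worksheet.

```python
# Sketch of a weighted average of ranks (illustrative; not KASB's actual worksheet).
# Assumed measure counts: three postsecondary, four graduation, and eight assessment
# measures (six NAEP plus ACT and SAT), with equal weights inside each group.
postsecondary = ["postsec_1", "postsec_2", "postsec_3"]
graduation = ["grad_1", "grad_2", "grad_3", "grad_4"]
naep = ["naep_1", "naep_2", "naep_3", "naep_4", "naep_5", "naep_6"]

weights = {}
weights.update({m: (1 / 3) / len(postsecondary) for m in postsecondary})
weights.update({m: (1 / 3) / len(graduation) for m in graduation})
# Within assessments, NAEP (as a group), ACT, and SAT each get 1/9 of the total.
weights.update({m: (1 / 9) / len(naep) for m in naep})
weights.update({"act": 1 / 9, "sat": 1 / 9})

assert abs(sum(weights.values()) - 1.0) < 1e-9  # the weights sum to one

def weighted_rank_average(state_ranks: dict) -> float:
    """state_ranks maps each measure name to the state's rank on that measure."""
    return sum(weights[m] * rank for m, rank in state_ranks.items())
```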

As it turns out, the weighting did not impact Kansas’ rank at all. We were still in tenth place. As for the states ahead of us, two dropped off the list, two others were added, and the remaining states moved around a little bit:

  1. New Hampshire (no change)
  2. Massachusetts (no change)
  3. New Jersey (up from #4)
  4. Iowa (up from #6)
  5. Nebraska (no change)
  6. Vermont (down from #3)
  7. Illinois (new - replaced either Indiana or Minnesota)
  8. North Dakota (up from #9)
  9. Connecticut (new - replaced either Indiana or Minnesota)

So, to summarize, KASB initially created a method for identifying states that perform better than we do in terms of student outcomes, and found that only five states met this criterion, putting us at number 6. Later we ran the same analysis with updated data and found Kansas had fallen to number 8. Then we decided to revise our methods to take new factors into consideration, to produce an overall ranking, and to base that ranking on an average that utilized weightings for the fifteen factors included, and we found Kansas to be at number 10.

Seeing the general downward trend in the earlier rankings, KASB also did something this time around we hadn’t done before: we looked at change over time. Though we didn’t use this information as part of the Aspiration States identification, we worked to note how many states moved ahead of or fell behind Kansas on the individual measures, utilizing the earliest data available going back to 2005. Unfortunately, in most cases there were more states moving ahead than there were falling behind.

The trend data allowed us to expand the conclusions we could draw from the comparisons. Previously we could assert that the data suggests Kansas student outcomes are high despite a funding rank somewhere in the middle. By looking at the historical data and determining how many states moved ahead of Kansas and how many fell behind, we could also say the data suggests Kansas is losing the lead it has on other states. To put it another way, if the trends suggested by this analysis continue, it is likely that Kansas’ ranking in terms of education outcomes will continue to fall.

KASB feels the new method for determining which states we should be looking to for ideas on improving educational outcomes in Kansas is sound, and is an improvement on what we have used in the past.


Tuesday, September 6, 2016

Key Findings from the KASB Calendar Survey Report

KASB recently released the 2016-17 KASB Calendar Survey Report, which provides information reported by Kansas public school districts to the Kansas Association of School Boards related to the school calendar from the 1995-96 school year through the 2015-16 school year. Over 90 percent of districts responded each year, with responses for many years nearing 100 percent.

The KASB Calendar Survey changed format beginning with the 2016-17 school year.  The report presents the data in the new format, and utilizes data formerly collected via the Calendar, Teacher Contract Details, and Negotiation Settlement Surveys.

Below are some of the key findings from the report.
  • From 1995-96 to 2015-16:
    • The actual calendar dates for the start and end of school for students and teachers, along with the start and end dates for winter and spring breaks, have remained very consistent.
    • There has been a slight increase overall in the total teacher days before school, the total teacher in-service days, teacher work/duty days, and in the length of winter break (since 2001-02).  However, there has been a slight decrease in the total teacher days after school. 
    • The average number of days per teacher has decreased from 187 days to 179 days, while the number of days for students fell from 179 days to 169 days.
    • The length of the school day for teachers increased from 7 hours and 50 minutes to 8 hours, and total time for students increased from 7 hours to 7 hours and 20 minutes.
    • Arrival times have gotten earlier for both teachers and students.
    • High schools start earlier than middle schools, which in turn start earlier than elementary schools.  
    • There is an overall trend for later departure times for both teachers and students.  
    • Teacher lunch times have remained fairly consistent at just under 30 minutes a day, while teacher prep time and teacher time spent outside scheduled work hours have increased across all school levels.
    • The number of middle school blocks went from just over 7.5 to just under 7.5 blocks or periods.  
    • Districts including a zero hour decreased from slightly over 15% to slightly under 15%.  
  • Combining the information on dates and times, we can estimate that total teacher time per year went from 1,464 hours and 50 minutes in 1995-96 to 1,432 hours in 2015-16, and total student time went from 1,253 hours in 1995-96 to 1,239 hours and 20 minutes in 2015-16 (see the sketch after this list).
  • Fewer than 15% of districts used block scheduling in 1995-96, over 45% reported using this schedule format as of 2002-03, and about 20% as of 2015-16.
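
The annual-time estimate in the bullet above works out as a simple days-times-day-length multiplication. Below is a rough reconstruction of that arithmetic (illustrative only; the report’s own calculation may differ in its details).

```python
# Rough reconstruction of the annual-time estimate: total time per year is taken
# to be the number of days multiplied by the length of the day. The day counts
# and day lengths are the approximate figures quoted in the findings above.

def total_hours(days: int, hours: int, minutes: int) -> float:
    return days * (hours + minutes / 60)

def fmt(total: float) -> str:
    h = int(total)
    m = round((total - h) * 60)
    return f"{h:,} hours and {m} minutes" if m else f"{h:,} hours"

print(fmt(total_hours(187, 7, 50)))  # teachers, 1995-96: 1,464 hours and 50 minutes
print(fmt(total_hours(179, 8, 0)))   # teachers, 2015-16: 1,432 hours
print(fmt(total_hours(179, 7, 0)))   # students, 1995-96: 1,253 hours
print(fmt(total_hours(169, 7, 20)))  # students, 2015-16: 1,239 hours and 20 minutes
```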

In addition to the report, KASB released accompanying data resources, which allow users to drill down and filter by district, KASB Region, KSHSAA Class, KNEA Uniserv, High School League, and other factors.

Questions on the report and data can be sent to tcarter@kasb.org.    



Monday, June 27, 2016

Center for Public Education Study Finds the Path to Career Readiness

The following post presents research or analyses from outside KASB and is presented for information purposes.  KASB neither endorses nor refutes the conclusions or recommendations contained herein.

The National School Boards Association’s Center for Public Education recently released the third installment in a series of reports about non-college goers, entitled “Path Least Taken III: Rigor and focus in high school pays dividends in the future.” The following are excerpts from their press release. You can find the full report here.


Today’s high school graduates can be ready for both college and the workplace with the right preparation and credentials. Higher education confers many benefits, including a better opportunity of obtaining good jobs with high wages; however, it is not the only path to success.

CPE’s study finds that well-prepared high school graduates can achieve success similar to, and in some cases greater than, that of college goers. The winning combination is what CPE calls “high credentials,” a mix of academic knowledge and job-specific or technical skills developed in high school plus a professional certificate or license.
Earlier installments in the Path Least Taken series found that the overall group of high school graduates who did not go to college faced the dimmest economic and social prospects at age 26 compared to those who did. The new report shows that students with high credentials, however, achieved better economic and social outcomes than two-year degree holders and students who did not complete their college education, with outcomes second only to those of four-year degree holders. High-credentialed non-college goers earned 39 percent more than non-credentialed non-college goers, and 21 percent more than 2-year degree holders at age 26 ($18.71 per hour compared to $13.42 and $15.43 respectively). The high credentials group trailed the 4-year degree graduates in hourly wage by only 3.4 percent.
The Path Least Taken identifies high credentials as including:
  • A C+ grade point average or above;
  • A high school diploma;
  • Algebra II and advanced science; and
  • An occupational concentration defined as three or more courses in a single labor market area culminating in a professional license or certificate.
“All students must be equipped with the knowledge and skills to reach their full potential,” said Thomas J. Gentzel, executive director of the National School Boards Association. “To better prepare the next generation, school leaders have to look at enhancing educational opportunities for all students, both college bound and career bound, to ensure future readiness for both.”
Using longitudinal data for students in the Class of 2004, CPE found that eight years after graduating from high school, a mere 13 percent had not gone on to either a two- or four-year college. Of the students who entered college, however, less than half emerged with a degree. Often burdened with student debt, these non-completers also find themselves with job prospects only slightly better than if they had not gone at all. However, CPE’s analysis reveals that high credentials earned in high school make a big difference for them, too, in terms of higher wages and full-time employment.
“With so many high school graduates going on to college, the focus for high schools has in large part been on college readiness, but at the expense of learning what makes graduates career-ready,” said Patte Barth, director of the Center for Public Education. “Just as high-level academic courses benefit all students whether they go to college or not, our analyses further show that career education in high school contributes significantly to the future success of all young adults.”