New Jersey School Scores

What a comparison between South Orange Middle and Maplewood Middle: Maplewood is a disgrace at 23.58, with SO at 72.1. Clinton did the best of the elementary schools. Do they still have the most ESL classes, and does that reflect on the scoring?


Here's the raw data they used to calculate the rankings/scores. The differences between MMS and SOMS (including one field that had a strange character in their datafile) are in bold.



SCHOOL_NAME                Maplewood MS      South Orange MS
StudentGroup               Schoolwide        Schoolwide
CutOffScoreTargetedSupImp  N                 N
Targeted_LowPerf           No                No
ELAProf_MetTarget          Met Target        Met Target
MathProf_MetTarget         Met Target†       Met Target
Grad4YR_MetTarget          N                 N
Grad5YR_MetTarget          N                 N
ELAGrowth_MetStandard      Not Met           Met Standard
MathGrowth_MetStandard     Met Standard      Exceeds Standard
ELP_MetTarget              **                N
CA_MetAverage              Met               Met
Targeted_ConsUnderPerf     No                No
SummativeRating            23.58             72.1
SummativeScore             33.63             63.65

SOMA has mediocre results as usual.  Merit pay anyone?


lord_pabulum said:
SOMA has mediocre results as usual.  Merit pay anyone?

I'm waiting for the translation of these numbers.  The numbers by themselves, without context and comparisons, are worse than useless.


tjohn said:


lord_pabulum said:
SOMA has mediocre results as usual.  Merit pay anyone?
I'm waiting for the translation of these numbers.  The numbers by themselves, without context and comparisons, are worse than useless.

 Except for comparison to other schools: 

 CHS - 37.5 percentile

 West Orange - 39.83

 Montclair - 37.54


cramer said:
 Except for comparison to other schools: 
 CHS - 37.5 percentile
 West Orange - 39.83
 Montclair - 37.54

 Not much difference.


tjohn said:


cramer said:
 Except for comparison to other schools: 
 CHS - 37.5 percentile
 West Orange - 39.83
 Montclair - 37.54
 Not much difference.

 Yup. Those are the two school districts with which SOMA is most often compared, at least in terms of demographics. 


These numbers REALLY shouldn't be used for anything. The state itself said last year that they were intended solely to help the department of education identify the lowest 5% of schools, and they're VERY much based on PARCC, where opt-out scores count as zeros. Best ignored. All the state report card data came out yesterday, and I think we can find better metrics to compare our schools by, IF we're going to insist on doing so at all.


tjohn said:


lord_pabulum said:
SOMA has mediocre results as usual.  Merit pay anyone?
I'm waiting for the translation of these numbers.  The numbers by themselves, without context and comparisons, are worse than useless.

 I'm going by the ratings and scores.  Also it is pretty simple to download the mediocre results for yourself.

SchoolName Rating Score
COLUMBIA HIGH SCHOOL 37.25 41.9
MAPLEWOOD MIDDLE SCHOOL 23.58 33.63
SOUTH ORANGE MIDDLE SCHOOL 72.1 63.65
CLINTON ELEMENTARY SCHOOL 81.52 71.6
JEFFERSON ELEMENTARY SCHOOL 69.67 61.96
SETH BOYDEN ELEMENTARY DEMONSTRATION SCHOOL 29.09 37.06
SOUTH MOUNTAIN ELEMENTARY SCHOOL 63.39 58.45
TUSCAN ELEMENTARY SCHOOL 55.63 53.93
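For anyone who wants to poke at these numbers themselves, here's a minimal Python sketch that ranks the district's schools by rating. The figures are just the ones from the table above; the dictionary layout and variable names are my own, not anything from the NJDOE file:

```python
# SOMSD summative ratings (percentile) and scores, as posted above.
schools = {
    "COLUMBIA HIGH SCHOOL": (37.25, 41.9),
    "MAPLEWOOD MIDDLE SCHOOL": (23.58, 33.63),
    "SOUTH ORANGE MIDDLE SCHOOL": (72.1, 63.65),
    "CLINTON ELEMENTARY SCHOOL": (81.52, 71.6),
    "JEFFERSON ELEMENTARY SCHOOL": (69.67, 61.96),
    "SETH BOYDEN ELEMENTARY DEMONSTRATION SCHOOL": (29.09, 37.06),
    "SOUTH MOUNTAIN ELEMENTARY SCHOOL": (63.39, 58.45),
    "TUSCAN ELEMENTARY SCHOOL": (55.63, 53.93),
}

# Sort by rating, highest first, and print a quick ranking.
for name, (rating, score) in sorted(schools.items(), key=lambda kv: -kv[1][0]):
    print(f"{name:<45} rating={rating:6.2f} score={score:6.2f}")
```

Running it puts Clinton at the top and Maplewood Middle at the bottom, with roughly a 48-point rating gap between the two middle schools.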



lord_pabulum said:
 I'm going by the ratings and scores.  Also it is pretty simple to download the mediocre results for yourself.


SchoolName Rating Score
COLUMBIA HIGH SCHOOL 37.25 41.9
MAPLEWOOD MIDDLE SCHOOL 23.58 33.63
SOUTH ORANGE MIDDLE SCHOOL 72.1 63.65
CLINTON ELEMENTARY SCHOOL 81.52 71.6
JEFFERSON ELEMENTARY SCHOOL 69.67 61.96
SETH BOYDEN ELEMENTARY DEMONSTRATION SCHOOL 29.09 37.06
SOUTH MOUNTAIN ELEMENTARY SCHOOL 63.39 58.45
TUSCAN ELEMENTARY SCHOOL 55.63 53.93


 Nevertheless, this information is useless unless I actually understand the basis.


If opting out of the PARCC results in a zero, these really are entirely useless numbers.


FilmCarp you've hit the nail on the head. 


FilmCarp said:
If opting out of the PARCC results in a zero, these really are entirely useless numbers.

 As useless as PARCC.


If you look at the raw data posted by @sprout above, you can see that the entire 50-point difference in "rating" is due to a) MMS missing on ELA growth and b) SOMS "exceeding" on math growth.  (The "score" -- which is the percentile ranking -- is pretty meaningless without seeing the spread and the curve.)

We have NO information about how close, or not, those measures of growth were.  And we have no definition of "growth," which has often in the past meant matching one year's cohort to that of the year before -- i.e. not measuring individual student growth.  (There had been talk of doing this differently, but that was before PARCC and I have no idea what they are doing now.)

And, once again, there is a significant demographic difference between the schools:  MMS has about 50% more economically disadvantaged students than SOMS.

I don't think anyone can really argue that in general the education a student gets at one school versus the other is significantly different.


jfburch said:

I don't think anyone can really argue that in general the education a student gets at one school versus the other is significantly different.

Why not? It's easy to say that on paper, they are exactly the same.  But delivery of that curriculum and experience could be very different.


I agree that content delivery is essential. 


Well sure.  But I did say in general.  And from my kids' experience (and kids of friends), delivery of curriculum and experience (which is already individual and subjective), varies from classroom to classroom--and sometimes that difference makes a difference--big or little.

SOMS and MMS  are not at all "exactly the same"--and I didn't suggest they were.  I just believe that they both have their strengths and weaknesses--of all sorts, but that on balance they provide an equal education.  And I don't believe those state numbers are picking up some major difference in what the two schools are doing or how well they do it.


Firstly, the data file has 16-17 data.  The article data is 17-18.  Secondly, this is basically a PARCC/ESSA addition exercise.  50% of the CHS data is PARCC, and 90% of the Elem.  And so on.  All this “score” did was aggregate the schools in one district against each other without any weighting of any sort.  It’s utter crap. 


The school my son is in now scored much higher than the one he was in last year.  Academically I don't see much difference between our current school and the school he went to in Maplewood.  If PARCC is what those scores are based on, students skipping the test could easily account for the huge difference (on paper) between the two schools.

My son was in the 2nd grade last year in SOMSD.  On four different occasions the PARCC was mentioned to me, and not just in passing either.  He has to learn XYZ because next year he'll be taking PARCC.  He needs to do online homework so that next year he'll know how to take tests online for PARCC.  I didn't bring it up; it was brought up by his teacher and other staff to me and/or groups of parents.  On two of these occasions they were even showing us sample test questions on the white board (smart board?) and explaining what was expected from the students.  This wasn't one crazy teacher, this was multiple teachers/staff, with presentations having been prepared.

This year, the year he's actually going to take the test, his current teacher didn't bring it up once when we went in for the parent/teacher conference.   I brought it up to his teacher at the conference because I remembered how much it was stressed last year, but his current teacher said they don't make a big deal out of it and don't worry about wasting too much time preparing the kids for PARCC.  At other school events no staff mentioned it or preparing for it.  I asked my son about it, thinking that maybe they were going directly to the kids with this, but all he knows about PARCC is that it is later in the school year.  He said the teachers didn't really mention it much other than stating when in the year it would be.

Some people will skip the tests no matter what, but I have to say that the high pressure of pushing the test and putting so much importance on it in SOMSD might be backfiring and is probably helping to drive away those parents who may have been on the fence about it. 



This is a question that has little to do with MSO schools and more to do with school and Title 1 funding.  I am asking about the metrics for receiving Title 1 funding.  According to the Federal Govt website, Title 1 schools are designated as those where 40% of children are considered economically disadvantaged.


My son attended a Title 1 school last year. The school received a $100,000 Title 1 grant.  According to the NJDOE, 1.5% of students in the school are considered economically disadvantaged.   In the district as a whole, 2.5% are disadvantaged.


How is this possible?  Our state funding also went up significantly last year, yet we are one of the most affluent districts in NJ and have some great new STEM labs, a pull-out G&T program, and extra study hall tutoring for kids deemed as “needing additional support”.  It seems so unfair to places like MSO, which have far more poor students and failing infrastructure, that we not only pay far less in property taxes but also get handouts for luxury items, like iPods for each child.


The state website for Title I funding is not very helpful. 

If I understand correctly, 40% economically disadvantaged students at a school allows for the designation of "Schoolwide" Title I  -- meaning you can use the Title I funds for the whole school's student body.

I'm not sure how schools below the 40% threshold become designated "Title I schools". But my understanding is that a school below 40% can still receive Title I funding, but the funds are supposed to be used specifically for programming/staff that serve eligible students (not the general student body). This Title I funding may be based on census data rather than the percentage of economically disadvantaged students at a school. 

@campbell29 :

If your census area includes some poorer areas (maybe rural regions?), it could explain how the Title I funding was allocated, rather than based on school need.


sprout said:
The state website for Title I funding is not very helpful. 
If I understand correctly, 40% economically disadvantaged students at a school allows for the designation of "Schoolwide" Title I  -- meaning you can use the Title I funds for the whole school's student body.
I'm not sure how schools below the 40% threshold become designated "Title I schools". But my understanding is that a school below 40% can still receive Title I funding, but the funds are supposed to be used specifically for programming/staff that serve eligible students (not the general student body). This Title I funding may be based on census data rather than the percentage of economically disadvantaged students at a school. 
@campbell29 :
If your census area includes some poorer areas (maybe rural regions?), it could explain how the Title I funding was allocated, rather than based on school need.

As far as I know, we are only one census tract, and we are surrounded by other very affluent areas.  The fact that we got so much money just didn’t pass the smell test.


