Wednesday, September 29, 2010

The Learning Network - Teaching and Learning with the New York Times

I wanted to let you know about The Learning Network blog, which provides teaching and learning materials and ideas based on New York Times content. They provide not only lesson plans that span subject areas and levels, but also News Quizzes and Opinion questions. You can also learn the Word of the Day, try out Test Yourself questions, complete a Fill-In, or read their Poetry Pairings.

According to their site, their mission is to "offer rich and imaginative materials for teaching and learning using New York Times content."

The blog also states:
Every weekday we offer new educational resources based on the articles, photographs, videos, illustrations, podcasts and graphics published in The New York Times – all for free.

We invite parents, teachers and students who are 13 and older to use our ideas and tools. We hope that through posting your comments you’ll become part of an ongoing conversation about teaching and learning.

For the 2010-’11 school year, here are 11 great ways to use our blog.

Throughout the year, we offer the following regular features:
  • Lesson Plans — Daily lesson plans based on New York Times content.
  • Student Opinion — News-related questions that invite response from students age 13 and older.
  • Word of the Day — Vocabulary words in the context of recent Times articles.
  • Test Yourself — Questions based on Times content that aim to strengthen literacy and numeracy skills.
  • 6 Q’s About the News — An activity in which students answer basic questions (Who, What, Where, When, Why and How) about an article.
  • News Quiz — Interactive daily news quizzes on current top stories.
  • On This Day in History — Listings of historical events and more for each day of the year.
  • Student Crossword — Topical puzzles geared toward teens.
  • Fill-Ins — Times articles from which words and phrases have been dropped. Fill in the blanks with your own words, or choose from a scrambled list of the words that were removed.
  • Poetry Pairings — A weekly collaboration with the Poetry Foundation in which we feature a work from its American Life in Poetry project alongside content from The Times that somehow echoes, extends or challenges the poem’s themes.
We hope teachers will use our blog to get and exchange ideas, parents to share how news stories have resonated at home, and students to express themselves on everything from politics to popular culture.
Just like on this blog, you can join the conversation by commenting on any post. We'd love to hear what you think!  J.T.

Tuesday, September 21, 2010

How Should We Measure an Organization’s Level of Psychometric Expertise?

Greg Pope, the Analytics and Psychometrics Manager for Questionmark, made a very interesting post to their blog yesterday. I think the post really makes you reflect on where your organization currently operates from a psychometric point of view, and it also raises the question of where your organization is headed. Considering the level of your organization, are your assessments psychometrically sound? Are you staffed at the appropriate level?

Greg's post also raises a myriad of other questions that I need to think about, but I hope you will take a few minutes to read this very interesting post and respond on this blog or on Questionmark's blog (the link is at the end of the post).

Here is Greg's post:
A colleague recently asked for my opinion on an organization’s level of knowledge, experience, and sophistication applying psychometrics to their assessment program. I came to realize that it was difficult to summarize in words, which got me thinking why. I concluded that it was because there currently is not a common language to describe how advanced an organization is regarding the psychometric expertise they have and the rigour they apply to their assessment program. I thought maybe if there were such a common vocabulary, it would make conversations like the one I had a whole lot easier.

I thought it might be fun (and perhaps helpful) to come up with a proposed first cut of a shared vocabulary around the levels of psychometric expertise. I wanted to keep it simple, yet effective in allowing people to quickly and easily communicate about where an organization would fall in terms of their level of psychometric sophistication. I thought it might make sense to break it out by areas (I thought of seven) and assign points according to the expertise/rigour an organization contains/applies. Not all areas are always led by psychometricians directly, but usually psychometricians play a role.


1. Item and test level psychometric analysis
a. Classical Test Theory (CTT) and/or Item Response Theory (IRT)
b. Pre hoc analysis (beta testing analysis)
c. Ad hoc analysis (actual assessment)
d. Post hoc analysis (regular reviews over time)

2. Psychometric analysis of bias and dimensionality
a. Factor analysis or principal component analysis to evaluate dimensionality
b. Differential Item Functioning (DIF) analysis to ensure that items are performing similarly across groups (e.g., gender, race, age, etc.)

3. Form assembly processes
a. Blueprinting
b. Expert review of forms or item banks
c. Fixed forms, computerized adaptive testing (CAT), automated test assembly

4. Equivalence of scores and performance standards
a. Standard setting
b. Test equating
c. Scaling scores

5. Test security
a. Test security plan in place
b. Regular security audits are conducted
c. Statistical analyses are conducted regularly (e.g., collusion and plagiarism detection analysis)

6. Validity studies
a. Validity studies conducted on new assessment programs and ongoing programs
b. Industry experts review and provide input on study design and findings
c. Improvements are made to the program if required as a result of studies

7. Reporting
a. Provide information clearly and meaningfully to all stakeholders (e.g., students, parents, instructors, etc.)
b. High quality supporting documentation designed for non-experts (interpretation guides)
c. Frequently reviewed by assessment industry experts and improved as required

Expertise/rigour points
0. None: Not rigorous, no expertise whatsoever within the organization
1. Some: Some rigour, marginal expertise within the organization
2. Full: Highly rigorous, organization has a large amount of experience

So an organization that has decades of expertise in each area would be at the top level of 14 (7 areas x 2 for expertise/rigour in each area = 14). An elementary school doing simple formative assessment would probably be at the lowest level (7 areas x 0 expertise/rigour = 0). I have provided some examples of how organizations might fall into various ranges in the illustration below.
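The scoring scheme above can be sketched in a few lines of code. This is only an illustration of the arithmetic Greg describes (seven areas, each rated 0, 1, or 2, for a maximum of 14); the area names and the example ratings are my own shorthand, not part of any real audit instrument.

```python
# Sketch of the proposed scoring scheme: seven areas, each rated
# 0 (none), 1 (some), or 2 (full) for expertise/rigour.
# Area names and example ratings are illustrative only.

AREAS = [
    "Item and test level psychometric analysis",
    "Bias and dimensionality analysis",
    "Form assembly processes",
    "Equivalence of scores and performance standards",
    "Test security",
    "Validity studies",
    "Reporting",
]

def psychometric_score(ratings):
    """Sum the per-area ratings (each 0-2) into an overall 0-14 score."""
    if len(ratings) != len(AREAS):
        raise ValueError("need exactly one rating per area")
    if any(r not in (0, 1, 2) for r in ratings):
        raise ValueError("each rating must be 0, 1, or 2")
    return sum(ratings)

# A hypothetical high-stakes certification body, strong in most areas:
example = [2, 2, 2, 2, 1, 2, 1]
print(psychometric_score(example), "out of", 2 * len(AREAS))  # 12 out of 14
```

An elementary school doing simple formative assessment would score near 0 under this scheme, exactly as Greg's example suggests, without that low score implying any shortcoming.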

There are obviously lots of caveats and considerations here. One thing to keep in mind is that not all organizations need to have full expertise in all areas. For example, an elementary school that administers formative tests to facilitate learning doesn’t need to have 20 psychometricians working for them doing DIF analysis and equipercentile test equating. Their organization being low on the scale is expected. Another consideration is expense: To achieve the highest level requires a major investment (and maintaining an army of psychometricians isn’t cheap!). Therefore, one would expect an organization that is conducting high stakes testing where people’s lives or futures are at stake based on assessment scores to be at the highest level. It’s also important to remember that some areas are more basic than others and are a starting place. For example, it would be pretty rare for an organization to have a great deal of expertise in the psychometric analysis of bias and dimensionality but no expertise in item and test analysis.

I would love to get feedback on this idea and start a dialog. Does this seem roughly on target? Would it be useful? Is something similar out there that is better that I don't know about? Or am I just plain out to lunch? Please feel free to comment directly on the Questionmark blog.

On a related note, Questionmark CEO Eric Shepherd has given considerable thought to the concept of an “Assessment Maturity Model,” which focuses on a broader assessment context. Interested readers should check out:

Monday, September 20, 2010

Google Instant - Search As You Type

Just in case you missed the news, Google has introduced Google Instant.

Google Instant is a new search enhancement that shows results as you type. The purpose is to help you get better search results, faster. Google's key assertion is that people type more slowly than they read, so they have created a way for you to scan a results page while you type.

According to Google:
The most obvious change is that you get to the right content much faster than before because you don’t have to finish typing your full search term, or even press “search.” Another shift is that seeing results as you type helps you formulate a better search term by providing instant feedback. You can now adapt your search on the fly until the results match exactly what you want. In time, we may wonder how search ever worked in any other way.

  • Faster Searches: By predicting your search and showing results before you finish typing, Google Instant can save 2-5 seconds per search.
  • Smarter Predictions: Even when you don’t know exactly what you’re looking for, predictions help guide your search. The top prediction is shown in grey text directly in the search box, so you can stop typing as soon as you see what you need.
  • Instant Results: Start typing and results appear right before your eyes. Until now, you had to type a full search term, hit return, and hope for the right results. Now results appear instantly as you type, helping you see where you’re headed, every step of the way.
Watch the video below, where Jonathan Effrat, the Product Manager, talks about Google Instant:

Wednesday, September 15, 2010

CareerTech's Online Learning Focus for 2010

The Oklahoma Department of CareerTech has published a short two-page document that highlights seven online learning initiatives for 2010.

Are you a new teacher looking for a mentor or a learning management system? If so, take a look at ICAT.

Are you in the market for eBooks and eCourses? Take a look at what eCIMC has to offer.

What about online competency testing? The CareerTech Testing Center offers 100+ skills standards and online competency tests.

These are just a few of the highlighted topics so I hope you will take a few minutes and discover the rest of the great online learning initiatives taking place at the Oklahoma Department of CareerTech!

New York Times Advises That Tests Help You Retain Learning

Here is a blog post by John Kleeman, the founder of Questionmark, that focuses on the retention of learning through testing:
I’d like to draw your attention to a thought-provoking article in the New York Times earlier this week about the best way to learn.

One interesting observation in the article is that although you might think studying in one quiet place is the best way to learn, this isn’t the case. It’s actually easier to learn if you move around to different places! It would seem that when the outside context varies, it’s easier to build the neural scaffolding that helps retain something in memory.

And, mirroring papers by Dr. Will Thalheimer commissioned by Questionmark (see The Learning Benefits of Questions and Providing Learners with Feedback), tests also help the retention of learning. In particular, the New York Times describes an experiment at Washington University in St. Louis in which two sets of students studied a reading passage in different ways. One set studied it twice in back-to-back sessions; the other studied it once and then took a practice test on it, within the same amount of time. As you can see in the diagram below, students who only studied learned the information well at the time but forgot about half of it within a week. Those who studied and then took a practice test retained much more of the information.
The bottom line from the research is that taking memory tests improves long-term retention. Tests don’t just measure learning: the act of taking a test helps you retain information you have learned.

Monday, September 13, 2010

Cheating Our Character

I found an interesting article by Emily Johnson, Staff Columnist, in the July 9, 2010 issue of The Dartmouth. As Johnson summarizes,
In comparison, Dartmouth sets an admirable example with regard to academic honesty. The first principle of the Academic Honor Policy states: “In recognizing the responsibility of students for their own education, [the Faculty of Dartmouth College] assumes intellectual honesty and integrity in the performance of academic assignments, both in the classroom and outside.” This assumption of integrity helps build student character. More importantly, it serves to prepare students for the future.

Unless there is clear cause to suspect misbehavior, anti-cheating efforts by universities should target the incentives to cheat before the student sits down to take a test. This method, which Dartmouth successfully employs, coupled with a reevaluation of the purpose of homework and exams, will seek to significantly reduce cheating the right way. An anti-cheating policy in this mold is the true “frontier” of the movement.
I personally think that Honor Codes are a great way to define cheating and its consequences in your school or program. These codes should be defined at the beginning of each academic year, and the student (and their parents in K-12 education) should sign the agreement prior to any assignment or testing. Although there will always be some unethical students who would rather cheat than work hard, or who think getting a good grade trumps all other concerns, we must reinforce honorable behavior in our students in hopes that it will carry through their professional lives.

Here is a link to a previous Honor Code post entitled: Honor Codes: Do They Result in Academic Honesty?

Do any of you use Honor Codes? Do you think they are needed? J.T.

Thursday, September 2, 2010

Department of Treasury Proposes Financial Education Core Competencies

I was reading the CTE Policy Watch Blog last night and noticed their post from a few days ago concerning the U.S. Department of the Treasury's notice in the Federal Register inviting the public to comment on a proposed set of financial education core competencies.

The CTE Policy Watch Blog post states:

According to the Financial Literacy and Education Commission, the financial education field lacks a standard curriculum. Specifically, there is no agreement on the appropriate basic content for financial literacy and education. In response, the Commission developed a set of core competencies, which would help establish a better understanding of what individuals should know and the basic concepts program providers should cover. The Department identified five core concept areas:
  • Earning
  • Spending
  • Saving
  • Borrowing
  • Protecting against risk
Specific core competencies for each concept area can be found in the table in the Federal Register.
Ultimately, the Department’s goal is to format these core competencies in a way that is easily remembered, such as the “food pyramid,” and they could have an impact on financial literacy instruction in schools and CTE programs. Therefore, it is important to comment by September 12 if you have any input on the Department’s proposed set of financial education core competencies. Comments are requested specifically on whether the list of Core Competencies is complete and whether there are portions that should be deleted, revised, or expanded. Written comments can be sent via e-mail or by mail to the Department of the Treasury, Office of Financial Education and Financial Access, 1500 Pennsylvania Avenue, NW, Washington, DC 20220.

Please visit the Association for Career and Technical Education's (ACTE) website for more great information on career and technology education.

Here are a couple of additional Financial Literacy resources: