The Baseball N = Large Problem
The N = Large problem is quite basic. Take data from a very large sample and use that data to predict some kind of result (such as success or failure) for a new case that is not in the sample. For those who work in machine learning, this should sound very familiar.
The N = Large problem is everywhere in professional sports. In baseball, it is commonly applied to the drafting and signing of amateur talent. After analyzing a player's height, weight, speed, fastball velocity, throwing accuracy, swing speed, walk rate, strikeout rate, etc., teams can estimate many things:
- the probability of a player making it in the major leagues
- the probability they will injure their arm throwing
- the probability their performance will justify a $5 million signing bonus
- ... a $2 million signing bonus
- ... a $1 million signing bonus
Teams can do this because of the large amount of historical data on players who have come through the major leagues, the minor leagues, and college. A rough sketch of what such a model looks like follows.
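To make the N = Large idea concrete, here is a minimal sketch of how a team might build such a model, assuming Python with scikit-learn. The feature set (fastball velocity, sprint time, walk rate, strikeout rate), the synthetic data, and the labels are all hypothetical stand-ins for the real historical scouting data a team would actually use.

```python
# Sketch of the N = Large idea: fit a classifier on many past players,
# then estimate P(makes the majors) for a new prospect not in the sample.
# All numbers below are made up for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_players = 5000  # the "N = Large" historical sample

# Columns: fastball velocity (mph), 60-yard dash time (s), walk rate, strikeout rate
X = np.column_stack([
    rng.normal(92, 3, n_players),      # fastball velocity
    rng.normal(7.0, 0.3, n_players),   # sprint time (lower is better)
    rng.normal(0.09, 0.03, n_players), # walk rate
    rng.normal(0.20, 0.06, n_players), # strikeout rate
])

# Toy outcome label: made the majors (1) or not (0).
# In reality this comes from each player's actual career history.
logit = 0.4 * (X[:, 0] - 92) - 2.0 * (X[:, 1] - 7.0)
y = (rng.random(n_players) < 1 / (1 + np.exp(-logit))).astype(int)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

# The new case not in the sample: one draft prospect's measurables
prospect = np.array([[95.0, 6.8, 0.11, 0.18]])
print("P(makes the majors) ~", model.predict_proba(prospect)[0, 1])
```

The same fitted model could just as easily output the probability of an arm injury or of justifying a given signing bonus; only the label changes.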
The Baseball N = 1 Problem
In professional sports, and especially in baseball, no individual player's chance of succeeding is ever 100%. The vast majority simply cannot jump straight from amateur ball to the pros [1]. While the historical data gives good guidance on the probability that a player can succeed, individual coaching, mental training, physical training, and mentoring ultimately decide the level of success a player has. Even for the most talented individuals, this can be the difference between making it to the big leagues and not. This training is unique to each specific player. Perhaps a player's swing is bad, or they need to get stronger to hit for more power, or their footwork needs to improve, or they need to learn how to throw a changeup, etc.
As an example, an article from ESPN highlights minor league player Alex Yarbrough. He is a reasonably talented player coming out of college, but his throwing mechanics are bad enough that he will never make it to the major leagues as a second baseman. So the Angels must work on those mechanics to turn him into a major-league-capable second baseman.
This is the N = 1 problem. Almost every amateur player has weaknesses that prevent them from playing at the major league level. Each player's weaknesses are unique to them. The work needed to fix them is unique to each player. It requires special attention from coaches and trainers to deal with these weaknesses and help each player improve.
Employee Retention: The N = Large vs. N = 1 Problem
In my opinion, there are two employee retention problems, but only one of them is usually discussed.
The first retention problem is the N = Large problem: retention programs meant to help all employees. This is the retention issue most commonly discussed and the one most companies try to improve on. It's easier to do and affects the most employees. You can gather data about employees, conduct surveys, and determine the best solutions for the employee population as a whole. In all honesty, these are probably also the best bang-for-the-buck retention ideas. What do N = Large retention solutions look like?
- Let's give the employees a recreation room
- Let's throw ice cream parties for the employees
- Let's add additional educational opportunities
- Let's do an employee hackathon
- Let's try to ...
etc.
However, there is an additional N = 1 problem in retention. What is it? It's concentrating on the needs of each individual employee. Every employee's pay, career goals, personal value in the work they're doing, personal frustrations, etc. are different, and they must be managed employee by employee. It's unlikely that the efforts aimed at the N = Large retention problem will apply here.
IMO, this is the retention problem that is rarely discussed, if at all. After all, it's hard. It requires managers and mentors (i.e., coaches) to look at each employee individually and determine what would be best for them. That's really hard. However, it may be the problem that needs to be discussed far more often.
[1] - According to Wikipedia, the last three players to do this were Mike Leake in 2010, Xavier Nady in 2000, and Jim Abbott in 1989. It's very rare: only three players in the last 26 years, out of roughly 1,000 players drafted each year.