Modeling and forecasting the diffusion of innovations has been an active academic interest since the 1960s and has become one of the most influential topics in marketing and management science. Among the main models of innovation diffusion, the Bass model, first published in 1969, quickly became the most popular way to describe the adoption process of new products in a population. The model accounts for both the initial importance of innovators and the “word of mouth” effect among adopters in determining the speed of adoption.

## Bass Diffusion Equation

The differential equation formulation of the Bass model is as follows:

$$\frac{dN(t)}{dt} = \left(p + \frac{q}{m}\,N(t)\right)\bigl(m - N(t)\bigr)$$

where $N(t)$ is the cumulative number of adopters at time $t$. $p$ is the coefficient of innovation, implying that a certain fraction of potential adopters independently begin to adopt the innovation at each time step. $q$ is the coefficient of imitation, representing the “word of mouth” effect whereby existing adopters spread the innovation to potential adopters at each time step. $m$ is the total number of adopters over the whole period of the innovation.
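
As a concrete illustration, the equation can be integrated numerically to trace the familiar S-shaped adoption curve. The sketch below uses forward Euler with illustrative parameter values (the specific $p$, $q$, and $m$ are assumptions, not values from the text):

```python
import numpy as np

def bass_curve(p, q, m, T, dt=0.01):
    """Integrate dN/dt = (p + q/m * N) * (m - N), N(0) = 0, with
    forward Euler; return N sampled at integer times 0..T."""
    N = 0.0
    out = [0.0]
    steps_per_unit = int(round(1 / dt))
    for k in range(1, T * steps_per_unit + 1):
        N += dt * (p + q / m * N) * (m - N)
        if k % steps_per_unit == 0:
            out.append(N)
    return np.array(out)

# Illustrative values (assumed): p and q are in the range often
# reported for consumer durables; m is an arbitrary market size.
curve = bass_curve(p=0.03, q=0.38, m=100_000, T=20)
```

With $p > 0$ the curve starts from zero, rises in an S-shape, and saturates at $m$.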

## Ordinary Least Squares (OLS)

The discrete analog of the Bass model allows the formulation to be written accordingly:

$$S(t) = N(t) - N(t-1) = \left(p + \frac{q}{m}\,N(t-1)\right)\bigl(m - N(t-1)\bigr) = a + b\,N(t-1) + c\,N(t-1)^2$$

with $a = pm$, $b = q - p$, and $c = -q/m$. Then we can estimate the parameters $p$, $q$, and $m$ by employing the OLS method to estimate the parameters $a$, $b$, and $c$ of the analog above. The residual sum of squares can be written as

$$\mathrm{RSS}(\beta) = \sum_{t}\bigl(S(t) - a - b\,N(t-1) - c\,N(t-1)^2\bigr)^2, \qquad \beta = (a, b, c)^\top$$

It is well known that $\hat{\beta}$, the instance of $\beta$ that minimizes the $\mathrm{RSS}$, has the analytical solution

$$\hat{\beta} = (X^\top X)^{-1} X^\top y$$

so we can compute it very easily. Here $X$ is the design matrix with rows $(1, N(t-1), N(t-1)^2)$ and $y$ is the vector of observed $S(t)$. The structural parameters then follow from $a = pm$, $b = q - p$, $c = -q/m$: $m$ is the positive root of $cm^2 + bm + a = 0$, after which $p = a/m$ and $q = -cm$.
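
The whole OLS procedure can be sketched in a few lines of NumPy. This is a minimal illustration, assuming the cumulative adoption counts are available in an array `N`; the parameter-recovery step takes the positive root for $m$:

```python
import numpy as np

def bass_ols(N):
    """Estimate (p, q, m) from cumulative adoption counts N[0..T] by
    regressing S(t) = N(t) - N(t-1) on 1, N(t-1), N(t-1)^2."""
    S = np.diff(N)                                        # per-period adoptions
    X = np.column_stack([np.ones_like(S), N[:-1], N[:-1] ** 2])
    a, b, c = np.linalg.lstsq(X, S, rcond=None)[0]
    # a = p*m, b = q - p, c = -q/m, so m is the positive root of
    # c*m^2 + b*m + a = 0 (c < 0, hence the minus sign below).
    m = (-b - np.sqrt(b * b - 4 * a * c)) / (2 * c)
    return a / m, -c * m, m                               # (p, q, m)
```

If the data were generated exactly by the discrete analog, the regression fit is exact and the original parameters are recovered.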

## Non-Linear Least Squares (NLS) Optimization

This approach seeks to minimize the objective function:

$$\mathrm{SSE}(\theta) = \sum_{t}\bigl(N(t) - \hat{N}(t;\theta)\bigr)^2$$

where $N(t)$ is the actual number of total adopters at time $t$ and $\hat{N}(t;\theta)$ is the predicted number of total adopters based on the Bass model, given the parameter vector $\theta = (p, q, m)$. For the prediction we can use the closed-form solution of the Bass equation,

$$\hat{N}(t;\theta) = m\,\frac{1 - e^{-(p+q)t}}{1 + \frac{q}{p}\,e^{-(p+q)t}}.$$
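
A minimal NLS sketch, assuming SciPy is available, fits the closed-form Bass curve to observed cumulative adoptions with `scipy.optimize.curve_fit`; the starting values and bounds are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def bass_N(t, p, q, m):
    """Closed-form cumulative Bass adoptions at time t."""
    e = np.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

def bass_nls(t, N_obs):
    """Fit (p, q, m) by non-linear least squares. Starting values and
    bounds below are assumptions; OLS estimates are a common choice."""
    p0 = [0.01, 0.1, 1.5 * N_obs[-1]]        # rough initial guess
    theta, _ = curve_fit(bass_N, t, N_obs, p0=p0,
                         bounds=([1e-6, 1e-6, N_obs[-1]],
                                 [1.0, 1.0, np.inf]))
    return theta                              # (p, q, m)
```

Because the objective is non-convex in $\theta$, NLS is sensitive to the starting point, which is one motivation for the global search methods discussed next.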

## Genetic Algorithm (GA)

A genetic algorithm (GA) is a search technique for finding exact or approximate solutions to optimization and search problems, and is considered a global search heuristic. GAs use techniques inspired by evolutionary biology such as inheritance, mutation, selection, and crossover. A typical genetic algorithm requires a genetic representation of the solution domain and a fitness function to evaluate candidate solutions. In a GA, an abstract representation of a candidate solution is called a chromosome; solutions are represented in some encoding, such as binary encoding, and evolve toward better solutions over time. A fitness function is a particular type of objective function that measures the optimality of a solution, so that a particular chromosome can be ranked against all other chromosomes.

The evolution usually starts from a population of randomly generated individuals. In each generation, the fitness of every individual in the population is evaluated; based on fitness, the fittest individuals are selected and modified through reproduction, crossover, or mutation to form a new population, which is then used in the next iteration of the algorithm. Commonly, the algorithm terminates either when a maximum number of generations has been produced or when a satisfactory fitness level has been reached within the population.
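
The procedure above can be sketched as a small real-coded GA that minimizes the NLS objective for the Bass model. Everything here (population size, selection scheme, mutation scale, search bounds) is an illustrative assumption rather than a prescribed configuration:

```python
import numpy as np

def bass_N(t, p, q, m):
    """Closed-form cumulative Bass adoptions at time t."""
    e = np.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

def sse(theta, t, N_obs):
    """Fitness is the (negated) sum of squared errors; lower is fitter."""
    p, q, m = theta
    return float(np.sum((N_obs - bass_N(t, p, q, m)) ** 2))

def ga_fit(t, N_obs, pop_size=60, n_gen=300, seed=0):
    rng = np.random.default_rng(seed)
    lo = np.array([1e-4, 1e-4, N_obs[-1]])        # search-space bounds
    hi = np.array([0.5, 1.0, 5.0 * N_obs[-1]])    # (assumed, not canonical)
    pop = rng.uniform(lo, hi, size=(pop_size, 3)) # random initial population
    for _ in range(n_gen):
        cost = np.array([sse(ind, t, N_obs) for ind in pop])
        elite = pop[np.argsort(cost)[: pop_size // 2]]     # selection
        children = []
        while len(children) < pop_size - len(elite):
            i, j = rng.integers(0, len(elite), size=2)
            w = rng.random()
            child = w * elite[i] + (1 - w) * elite[j]      # arithmetic crossover
            child += rng.normal(0.0, 0.01, 3) * (hi - lo)  # Gaussian mutation
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([elite, np.array(children)])
    cost = np.array([sse(ind, t, N_obs) for ind in pop])
    return pop[np.argmin(cost)]                    # fittest chromosome
```

Keeping the elite half unmutated means the best solution found so far is never lost, so the best fitness improves monotonically across generations.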

## Maximum Likelihood Estimation (MLE)

Let $f(t)$ denote the probability density function of adoption at time $t$, and $F(t)$ the cumulative distribution function. Then the Bass equation can be written as

$$f(t) = \frac{dF(t)}{dt} = \bigl(p + q\,F(t)\bigr)\bigl(1 - F(t)\bigr)$$

This is an ordinary differential equation that can be solved with the initial condition $F(0) = 0$. Integrating both sides of the above equation, we get

$$F(t) = \frac{1 - e^{-bt}}{1 + a\,e^{-bt}}$$

where $a = q/p$ and $b = p + q$. However, this solution assumes that all consumers in the population eventually adopt the new product, i.e., the cumulative distribution function tends to 1. The Bass equation describes a diffusion system where the eventual adoption probability is $\bar{F} \le 1$, so the analytical solution for the cumulative distribution function should be

$$F(t) = \bar{F}\,\frac{1 - e^{-bt}}{1 + a\,e^{-bt}}$$

Note that the population in this system is $m' = m/\bar{F}$, where $m$ is the size of the adopter population in our previous models. We know the likelihood function is

$$L(a, b, \bar{F}) = \prod_{i=1}^{K}\bigl[F(t_i) - F(t_{i-1})\bigr]^{x_i}\,\bigl[1 - F(t_K)\bigr]^{\,m' - N(t_K)}$$

where $t_1 < t_2 < \cdots < t_K$ are the time points of the data, $t_0 = 0$, $x_i = N(t_i) - N(t_{i-1})$ is the number of adopters observed in the $i$-th interval, and $m' - N(t_K)$ is the number of people who did not adopt by time $t_K$. Thus the analytical form of the log-likelihood function is given by

$$\ell(a, b, \bar{F}) = \sum_{i=1}^{K} x_i \log\bigl[F(t_i) - F(t_{i-1})\bigr] + \bigl(m' - N(t_K)\bigr)\log\bigl[1 - F(t_K)\bigr]$$

After solving for $\hat{a}$, $\hat{b}$, and $\hat{\bar{F}}$, we can easily get the estimators of $p$, $q$, and $m$ by the one-to-one correspondence between them: $p = b/(1+a)$, $q = ab/(1+a)$, and $m = \bar{F}\,m'$.
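
A minimal MLE sketch, assuming SciPy is available and the consumer population size `m_prime` is known: it maximizes the grouped-data log-likelihood above numerically over $(a, b, \bar{F})$ and maps the result back to $(p, q, m)$. The starting values are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

def F(t, a, b, Fbar):
    """Bass CDF with eventual adoption probability Fbar."""
    e = np.exp(-b * t)
    return Fbar * (1 - e) / (1 + a * e)

def neg_loglik(theta, t, N, m_prime):
    a, b, Fbar = theta
    Ft = np.concatenate([[0.0], F(t, a, b, Fbar)])
    x = np.diff(np.concatenate([[0.0], N]))   # adopters per interval x_i
    probs = np.diff(Ft)                        # F(t_i) - F(t_{i-1})
    tail = m_prime - N[-1]                     # non-adopters by t_K
    return -(np.sum(x * np.log(probs)) + tail * np.log(1.0 - Ft[-1]))

def bass_mle(t, N, m_prime, x0=(1.0, 0.5, 0.5)):
    """Maximize the log-likelihood over (a, b, Fbar); x0 is an assumed
    starting point and matters in practice."""
    res = minimize(neg_loglik, x0=np.array(x0), args=(t, N, m_prime),
                   bounds=[(1e-3, 100.0), (1e-3, 5.0), (1e-3, 0.999)])
    a, b, Fbar = res.x
    # One-to-one correspondence back to the Bass parameters.
    return b / (1 + a), a * b / (1 + a), Fbar * m_prime  # (p, q, m)
```

With expected (noise-free) counts the likelihood is maximized at the true parameters, so the mapping recovers them up to numerical precision.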
