There are tons of different processors and architectures dating from the late 1940s to the present; far too many to detail in a forum post. Intel's tick-tock model has the company releasing a new architecture and a die shrink in alternating years. Prior to this, die shrinks and new architectures came more sporadically.
If we were to clump these into "generations", I would do it as follows:
From the dawn of computing until the mid-1960s is the "pre-ASCII era", when most computers used radically different memory structures, architectures, and standard data sizes. Many early computers used 36-bit words because this provided integer accuracy to 10 decimal digits. The standardization of ASCII caused everything to fall into multiples of 8 bits.
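As a quick sanity check on that 36-bit figure, here is a short Python sketch (my own illustration, not anything from period hardware; it assumes one bit is reserved for the sign, leaving 35 bits of magnitude):

```python
import math

# With 1 sign bit, a 36-bit word leaves 35 bits for magnitude.
max_magnitude = 2**35 - 1
print(max_magnitude)  # 34359738367

# Number of full decimal digits that always fit in 35 bits:
decimal_digits = math.floor(35 * math.log10(2))
print(decimal_digits)  # 10

# The largest 10-digit decimal integer is comfortably within range.
assert 9_999_999_999 <= max_magnitude
```

So every 10-digit decimal integer fits, which is presumably why 36-bit words were a natural choice for machines aimed at decimal-oriented scientific and business work.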
From the mid-1960s to the early 1980s is the "big iron" era. Computers were primarily the domain of large corporations and academic institutions. This era coincides with advances in multiprogramming and operating system theory, which saw computers transition from the simple batch processing of the 1950s to resource-managed environments similar to those we still use today.
The mid-1970s gave rise to the "PC era", when the functions of big iron, which had spanned multiple chips, boards, and cabinets, were condensed into single-chip packages suitable for much smaller devices. A number of manufacturers, such as Motorola, MOS Technology, Zilog, and the now-famous Intel, all produced 8-bit microprocessors that fit inside a single package. In the early 1980s IBM marketed the IBM PC, which was targeted at dominating the then-small hobby PC market and opening it up to general use.
Functionally, not much has changed since the mid-1980s. Chip manufacturers have simply gotten better at doing the same thing; there have been very few radical changes like those of the 1960s and 1970s. However, in the interest of categorization, I would suggest that the era from the early 1980s to the year 2000 be known as the "pre-millennial" era and the years from 2000 onward as the "post-millennial" era. The year 2000 nicely coincides with the time the internet began to explode and stakeholders around the world began to take information technology seriously. Simultaneously, computers have made their way into every device imaginable.