JimmyJames

I think it's important to understand that the idea of the waterfall process, as most people understand it, was an error. I don't know exactly what went wrong, but if you read the original waterfall paper by Winston Royce, you'll see that he describes the outline of the process which came to be known as waterfall. Right after that, there's this line:

I believe in this concept, but the implementation described above is risky and invites failure.

Emphasis is mine. Somehow, this key observation was lost, and several decades were spent 'inviting failure' before it was generally accepted that the approach wasn't working. And to be sure, failure showed up again and again. I really encourage everyone interested in software development processes to read the original 'waterfall' paper linked above. If you don't stop reading after the first page-and-a-half, you'll find that Royce advocates a process much more like Agile than the waterfall process (as it came to be understood).

In a nutshell, I think the answer to the question you are asking is that, in 'classic' waterfall, you spend a lot of time getting all the requirements and design done before you start developing things. This is simply inefficient. That is, it takes longer to get the same result you would get with 'agile' methods. Often there's not enough time for the process to actually produce a good (or even a working) result.

There are a couple of reasons for this. First off, what are the developers doing while the requirements and design are being completed? Hopefully you have some other work for them to do for those months. The second big issue is that the end product almost never looks like what was first envisioned. My experience is that you can't really get good requirements from people until they see the software and start telling you what is wrong with it. Often the requirements you get prior to that are fundamentally flawed. They might not be feasible, might not solve any real problem, or might fail to consider many factors that are key to the success of the product. All that time spent writing down those wrong ideas and designing around them is simply wasted. You are far more likely to get a working result quickly if you start creating software that can be put in front of the people who will be using it sooner.
