There are no answers that are true in 100% of situations. For example, I would prefer the first query when I'm on a fast enough network (anything 1 Gbps or higher) and data transfer is free. The difference in the example you show is 40,000 bytes, but that's only 2.7% of the total.
On the other hand, if you need to minimize bandwidth usage, that's a different scenario: for example, if you are transmitting data to a satellite over a very slow link; or if you have so much competing network traffic that the link is saturated and saving 2.7% is significant; or if you are using a cloud vendor that charges for total bytes transferred on the network.
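A quick back-of-envelope calculation makes the point. The numbers below come from the figures above (a 40,000-byte difference that is 2.7% of the total); the link speeds are illustrative assumptions, not anything from your setup:

```python
# Figures from the discussion above: the two result sets differ by
# 40,000 bytes, and that difference is 2.7% of the total transfer.
diff_bytes = 40_000
total_bytes = diff_bytes / 0.027  # ~1.48 MB total

def transfer_seconds(n_bytes: float, bits_per_second: float) -> float:
    """Idealized transfer time, ignoring protocol overhead and latency."""
    return n_bytes * 8 / bits_per_second

# On a 1 Gbps LAN the 40 KB difference is a fraction of a millisecond...
lan = transfer_seconds(diff_bytes, 1e9)       # 0.00032 s
# ...but on a 128 kbps satellite-style link it costs seconds.
sat = transfer_seconds(diff_bytes, 128e3)     # 2.5 s

print(f"total ≈ {total_bytes / 1e6:.2f} MB")
print(f"40 KB at 1 Gbps:   {lan * 1000:.2f} ms")
print(f"40 KB at 128 kbps: {sat:.1f} s")
```

Same 2.7%, completely different significance depending on the link.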
You aren't taking into account the overhead of executing 1000 separate queries. That means 1000× the bytes sent to the database server just to submit the queries, which consumes network bandwidth too. The database server also needs to parse and optimize each query (MySQL does not cache optimization plans the way some other RDBMS products do), and then begin executing it, starting with an index lookup that has no context from the previous query's result.
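Here's a minimal sketch of the two approaches, using SQLite as a stand-in for MySQL since it's self-contained; the table and column names are made up for illustration. On a real client/server database, option A also pays one network round trip per query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO items (id, payload) VALUES (?, ?)",
    [(i, f"row {i}") for i in range(1000)],
)

# 500 ids here, to stay under older SQLite host-parameter limits;
# real code would batch a 1000-id list the same way.
ids = list(range(500))

# Option A: one query per id -- N parses, N plans, N index lookups,
# and on a client/server database, N network round trips.
rows_a = [
    conn.execute("SELECT payload FROM items WHERE id = ?", (i,)).fetchone()[0]
    for i in ids
]

# Option B: one query with an IN list -- one parse, one plan, one round trip.
placeholders = ",".join("?" for _ in ids)
rows_b = [
    payload
    for (payload,) in conn.execute(
        f"SELECT payload FROM items WHERE id IN ({placeholders}) ORDER BY id",
        ids,
    )
]

assert rows_a == rows_b  # same data; very different per-query overhead
```

The batched form trades a slightly larger request for eliminating almost all of the per-query fixed costs.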
"Performance" is a slippery term. You seem to be concerned only with bandwidth use, but there are other ways of measuring performance.
- Throughput (bytes per second)
- Latency (seconds per response)
- Wait time (seconds until the request begins)
All of these can be affected by the total load, i.e. number of concurrent requests. A busy system may have traffic congestion or request queueing.
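To make the three metrics above concrete, here is a small sketch computing them from per-request timestamps; the field names and the sample numbers are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Request:
    arrived: float   # when the request was received (seconds)
    started: float   # when the server began working on it
    finished: float  # when the last byte of the response was sent
    n_bytes: int     # response size in bytes

def wait_time(r: Request) -> float:
    """Seconds spent queued before work begins."""
    return r.started - r.arrived

def latency(r: Request) -> float:
    """Seconds from request to complete response."""
    return r.finished - r.arrived

def throughput(r: Request) -> float:
    """Bytes per second while the request was actually being served."""
    return r.n_bytes / (r.finished - r.started)

# Hypothetical request: queued 0.5 s, served in 2.0 s, 1 MB response.
r = Request(arrived=0.0, started=0.5, finished=2.5, n_bytes=1_000_000)
print(wait_time(r), latency(r), throughput(r))
```

On a congested system, wait time is typically the first number to blow up: throughput per request may look fine even while requests sit in a queue.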
You can see that this isn't a simple problem.
There are also details we don't know: is the column a `VARCHAR` or something else? How are you downloading the data: with `SELECT`? `LOAD DATA`? Or something else? Is the download configured to fetch the entire resultset, then feed the rows to you one at a time?