How To Stream Post Data Into Python Requests?
Solution 1:
requests.post does take an iterator or generator as the data argument; the details are described in the documentation under Chunk-Encoded Requests. The transfer encoding has to be chunked in this case because the data size is not known beforehand.
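In the simplest case the data can come from a plain generator. A minimal sketch (httpbin.org/post is just an echo endpoint used for testing):

import requests

def produce():
    # Each yielded bytes object is sent as one chunk of the request body.
    yield b'first chunk'
    yield b'second chunk'

# Since the generator has no known length, requests switches to
# Transfer-Encoding: chunked automatically.
r = requests.post("http://httpbin.org/post", data=produce())
print(r.status_code)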
Here is a very simple example that uses a queue.Queue and can be used as a file-like object for writing:
import requests
import queue
import threading

class WriteableQueue(queue.Queue):

    def write(self, data):
        # An empty string would be interpreted as EOF by the receiving server
        if data:
            self.put(data)

    def __iter__(self):
        return iter(self.get, None)

    def close(self):
        self.put(None)

# The queue size can be limited in case producing is faster than streaming.
q = WriteableQueue(100)

def post_request(iterable):
    r = requests.post("http://httpbin.org/post", data=iterable)
    print(r.text)

threading.Thread(target=post_request, args=(q,)).start()

# Pass the queue to the serializer that writes to it ...
q.write(b'1...')
q.write(b'2...')
# Closing ends the request.
q.close()
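Because queue.Queue.put blocks once the queue is full, a producer that writes faster than the network can send is throttled automatically; that backpressure is what the size limit is there for.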
Solution 2:
The only way of connecting a data producer that requires a push interface for its data sink with a data consumer that requires a pull interface for its data source is through an intermediate buffer. Such a system can only operate by running the producer and the consumer in "parallel": the producer fills the buffer and the consumer reads from it, each of them being suspended as necessary. That parallelism can also be simulated with cooperative multitasking, where the producer yields control to the consumer when the buffer is full, and the consumer returns control to the producer when the buffer runs empty. By taking the generator approach you would be building a custom-tailored cooperative multitasking solution for your case, which will hardly end up simpler than the easy pipe-based approach, where the responsibility for scheduling the producer and the consumer lies entirely with the OS.
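A minimal sketch of that pipe-based approach, assuming the same httpbin.org test endpoint as above (the 8192-byte chunk size is an arbitrary choice): requests pulls from the read end of an os.pipe in a worker thread while the producer pushes into the write end, and the OS suspends whichever side gets ahead of the other.

import os
import threading
import requests

# An OS-level pipe is the intermediate buffer: the consumer pulls from the
# read end while the producer pushes into the write end.
read_fd, write_fd = os.pipe()

def post_request(fd):
    with os.fdopen(fd, 'rb', buffering=0) as source:
        # Each read() returns as soon as some data arrives in the pipe, so
        # chunks are forwarded without waiting for the whole body; the empty
        # bytes sentinel at EOF ends the iteration and thus the request.
        r = requests.post("http://httpbin.org/post",
                          data=iter(lambda: source.read(8192), b''))
        print(r.text)

consumer = threading.Thread(target=post_request, args=(read_fd,))
consumer.start()

# The producer: write() blocks when the pipe buffer is full, suspending the
# producer until the consumer has caught up.
with os.fdopen(write_fd, 'wb', buffering=0) as sink:
    sink.write(b'1...')
    sink.write(b'2...')
# Leaving the with-block closes the write end, which signals EOF.

consumer.join()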