Is it possible to treat ~50K messages (from a single partition) as one logical batch in Spark Streaming?
Spark Streaming doesn't work that way. It models an infinite stream of data flowing in and being processed at each batch interval. This means that if you want to signal a logical "end of batch", you'll need to send a marker message indicating that the batch of data is complete, so you know when to write the processed messages to an output sink of your choice.
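As a minimal sketch of the marker idea (outside Spark, so it's runnable here): the sentinel value `"END_OF_BATCH"` and the `flush_to_sink` callback are illustrative assumptions, not part of Spark's API.

```python
# Hypothetical sentinel marking the end of a logical batch (an assumption,
# chosen by the producer; not anything Spark defines).
END_MARKER = "END_OF_BATCH"

def split_batches(records, flush_to_sink):
    """Buffer records until the sentinel arrives, then flush that batch.

    Returns any records still pending (no sentinel seen for them yet),
    which would be carried over to the next interval.
    """
    buffer = []
    for rec in records:
        if rec == END_MARKER:
            flush_to_sink(list(buffer))  # hand the completed batch to the sink
            buffer.clear()
        else:
            buffer.append(rec)
    return buffer
```

In a real job the producer would append the marker after the 50K messages, and the consumer side would only flush to the sink once it sees it.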
One way to achieve this is with stateful streams (e.g. `updateStateByKey` or `mapWithState`), which aggregate data across batches and let you keep state between batch intervals.
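To illustrate, here is a sketch of the kind of update function you would pass to `DStream.updateStateByKey` in PySpark (called per key with the new values from the current interval and the previous state). The function itself is plain Python, so it can be shown standalone; the surrounding Spark wiring is only hinted at in a comment.

```python
def update_count(new_values, running):
    """Accumulate a per-key count across batch intervals.

    new_values: values for this key that arrived in the current interval.
    running:    state carried over from previous intervals (None at first).
    """
    current = running or 0
    return current + sum(new_values)

# In a real streaming job (not runnable without a Spark context), roughly:
#   counts = pairs.updateStateByKey(update_count)
# The state persists between intervals, so you can keep accumulating until
# your end-of-batch marker arrives, then emit the final aggregate.
```

The returned value becomes the state seen by the next interval, which is what lets you aggregate an entire logical batch that spans several micro-batches.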