Thanks, the point about batched inputs makes sense; I misunderstood! I also didn't 
mean to imply that batch size itself is unnecessary; I do think it's a fairly 
universal concept for data loading (except perhaps for dynamic models where the 
input shape changes per instance, but in that case you could dynamically batch 
instances of the same shape and/or set the batch size to 1 and aggregate results 
manually). In any case, supporting non-batched inputs seems out of scope for 
DataLoader, so I wouldn't worry about it too much.
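Just to illustrate what I mean by dynamically batching the same shapes, here's a rough sketch (the `batch_by_shape` helper and the NumPy-array samples are hypothetical, not part of DataLoader):

```python
import numpy as np
from collections import defaultdict

def batch_by_shape(samples, batch_size):
    """Bucket samples by shape; yield a stacked batch whenever a bucket fills.

    Hypothetical helper: assumes each sample is a NumPy array whose .shape
    is the bucketing key.
    """
    buckets = defaultdict(list)
    for s in samples:
        buckets[s.shape].append(s)
        if len(buckets[s.shape]) == batch_size:
            # Full bucket: stack into one batch and reset it.
            yield np.stack(buckets.pop(s.shape))
    # Flush leftover partial batches for each remaining shape.
    for rest in buckets.values():
        yield np.stack(rest)

# Two distinct input shapes, interleaved.
samples = [np.zeros((3,)), np.zeros((5,)), np.zeros((3,)), np.zeros((5,))]
batches = list(batch_by_shape(samples, batch_size=2))
# yields one (2, 3) batch and one (2, 5) batch
```

The batch-size-1 fallback is just the degenerate case of this: every batch has a single instance, and the caller aggregates the per-instance results.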

Thanks for clarifying!

---
[Visit Topic](https://discuss.tvm.apache.org/t/dataloader-an-api-to-wrap-datasets-from-other-machine-learning-frameworks/9498/10) to respond.
