Generating datasets and random files of various size

Data that mimics real-world data is usually best, but sometimes we need an assortment of files of varying content and size for validation testing without delay. Imagine that you have a web server running some sort of application that accepts files for storage, and that a size limit is enforced on those files. Wouldn't it be great to just whip up a batch of test files in an instant?

To do this, we can use a few filesystem features such as /dev/random and a useful program called dd. The dd command is a utility that converts and copies files (including devices, thanks to Linux's "everything is a file" concept, more or less). It can be used in a later recipe ...
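As a minimal sketch of the idea, the loop below generates a handful of files filled with random bytes at several sizes. The filenames and sizes are illustrative, and it reads from /dev/urandom rather than /dev/random so the loop never blocks waiting for entropy:

```shell
#!/bin/bash
# Generate random files of 1 KiB, 10 KiB, and 100 KiB (sizes are examples).
# bs is the block size in bytes; count is how many blocks dd copies.
for size in 1 10 100; do
    dd if=/dev/urandom of="random_${size}K.bin" bs=1024 count="$size" 2>/dev/null
done

ls -l random_*.bin
```

Multiplying bs by count gives the resulting file size, so the same one-liner scales from a few bytes to gigabytes just by adjusting those two operands.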
