Currently, files uploaded directly (through the browser, drag and drop, or the API) are limited to 64 GB in size. Remote sources using http, https, hdfs, asv, odata, odatas, dropbox, gcs, and gdrive are also limited to 64 GB; with s3, the limit is 5 TB. Inline sources are limited to 16 MB. We plan to raise these limits, but there hasn't been demand yet, so if you think you need larger files, be sure to let us know.
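The limits above can be summarized in a small lookup. This is just an illustrative sketch using the values from this page; the dictionary and function names are ours, not part of the BigML API.

```python
# Source size limits as described above. All names here are illustrative.
MB = 1024 ** 2
GB = 1024 ** 3
TB = 1024 ** 4

# Maximum source size in bytes, per upload mechanism / scheme.
UPLOAD_LIMITS = {
    "direct": 64 * GB,   # browser, drag and drop, or API upload
    "http": 64 * GB, "https": 64 * GB, "hdfs": 64 * GB,
    "asv": 64 * GB, "odata": 64 * GB, "odatas": 64 * GB,
    "dropbox": 64 * GB, "gcs": 64 * GB, "gdrive": 64 * GB,
    "s3": 5 * TB,        # s3 sources may be much larger
    "inline": 16 * MB,
}

def fits_limit(scheme: str, size_bytes: int) -> bool:
    """Return True if a source of this size can be uploaded via `scheme`."""
    return size_bytes <= UPLOAD_LIMITS[scheme]
```

For example, a 100 GB file is rejected over http but accepted from s3.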
BigML also restricts how much data a single task can process. When you work with a large source, you select a percentage of it on which to perform a task (e.g., creating a dataset). The maximum task size varies per subscription plan, as shown in the table below:
| Task Size | 64 MB | 1 GB | 4 GB | 8 GB | 16 GB | 32 GB | 64 GB |
Notice that any task under 16 MB can be performed for free.
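The sampling rule above amounts to a simple ratio: the fraction of a source a single task can cover is the plan's task size divided by the source size. A minimal sketch, with illustrative names that are not part of the BigML API:

```python
# Sketch of the per-task sampling described above.
GB = 1024 ** 3

def max_sample_rate(source_bytes: int, task_limit_bytes: int) -> float:
    """Largest fraction of the source that one task may process."""
    return min(1.0, task_limit_bytes / source_bytes)

# A 10 GB source under a plan with a 4 GB task size: at most 40% per task.
rate = max_sample_rate(10 * GB, 4 * GB)
```

Sources smaller than the plan's task size need no sampling at all, so the rate caps at 1.0.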