Hi,
What is the maximum flat file size a file connection point can consume?
I need to process files of about 160 MB, but it looks like ION does not like them.
Thanks,
KK
Hi Kirill,
The maximum file size depends on the license tier: https://docs.infor.com/inforosulmt/xx/en-us/usagelimits/default.html?helpcontent=nkb1720800864258.html
The recommended size is 5 MB: https://docs.infor.com/inforosulmt/xx/en-us/usagelimits/default.html?helpcontent=nkb1720800871140.html
We are working on increasing the maximum file size that ION can handle. What kind of file is it? Do you need to transfer this file (Lift & Shift), or do you want to process it (Mapping, Scripting, Routing, Monitor, etc.)?
Kind regards, Danil.
Kirill, there are Batch and Streaming APIs to write the data to Data Lake if you have that option in Airflow.
https://docs.infor.com/inforos/2024.x/en-us/useradminlib_cloud/default.html?helpcontent=datafabrug/llv1631199543458.html
If you require an ION connection point, then the limits apply, as the file will need to go through a flow.
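In case it helps, here is a minimal sketch of what a Batch/Streaming-style upload from an Airflow task could look like using Python's requests library. The base URL, endpoint path, query parameter, and token handling below are placeholders I am assuming for illustration, not the documented Data Fabric API; the actual ingestion endpoints and auth flow are in the guide linked above.

```python
# Minimal sketch: streaming a local file to Data Lake over HTTPS.
# NOTE: the base URL, endpoint path, and query parameter below are
# placeholders, not the documented Data Fabric API; check the linked
# Data Fabric user guide for the real ingestion endpoints and auth.
import requests

ION_API_BASE = "https://mingle-ionapi.example.com/TENANT"   # assumption
INGEST_PATH = "/DATAFABRIC/v1/ingest/stream"                # assumption
ACCESS_TOKEN = "..."  # OAuth2 bearer token from your ION API credentials

def upload_file(local_path: str, object_name: str) -> None:
    """Stream a file to the (assumed) Data Lake ingestion endpoint."""
    with open(local_path, "rb") as fh:
        resp = requests.post(
            f"{ION_API_BASE}{INGEST_PATH}",
            params={"name": object_name},        # placeholder parameter
            headers={
                "Authorization": f"Bearer {ACCESS_TOKEN}",
                "Content-Type": "application/octet-stream",
            },
            data=fh,      # requests streams the file body in chunks
            timeout=300,
        )
    resp.raise_for_status()

upload_file("/data/export_160mb.txt", "export_160mb.txt")
```

Because the file object is streamed rather than read into memory, a 160 MB text file is not a problem on the client side; the limit that matters is whatever the Data Fabric service itself enforces.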
For your 'upload files to Data Lake' case … as Danil asked, there is nothing in the ION flow except a connection point to collect the file and a first step to write it to Data Lake (or another destination). There is no filter, mapping, scripting, … This is generally referred to as Lift and Shift. What type of connection point, and what type of file (XML, JSON, DSV, binary, ..)? For reading files, an application will use the Data Fabric File APIs to read them.
https://docs.infor.com/inforos/2024.x/en-us/useradminlib_cloud/default.html?helpcontent=datafabrug/pba1631199548960.html
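For the read side, a rough sketch of streaming an object back to disk with requests; again, the endpoint path and parameter names here are assumptions for illustration, and the real File/Retrieve API paths are in the Data Fabric user guide linked above.

```python
# Minimal sketch: streaming an object out of Data Lake to a local file.
# NOTE: endpoint path and parameter names are placeholders; see the
# Data Fabric user guide above for the actual File/Retrieve API.
import requests

ION_API_BASE = "https://mingle-ionapi.example.com/TENANT"   # assumption
RETRIEVE_PATH = "/DATAFABRIC/v1/objects/stream"             # assumption
ACCESS_TOKEN = "..."  # OAuth2 bearer token

def download_object(object_name: str, local_path: str) -> None:
    """Stream one Data Lake object to disk without loading it into memory."""
    resp = requests.get(
        f"{ION_API_BASE}{RETRIEVE_PATH}",
        params={"name": object_name},            # placeholder parameter
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        stream=True,
        timeout=300,
    )
    resp.raise_for_status()
    with open(local_path, "wb") as out:
        for chunk in resp.iter_content(chunk_size=1024 * 1024):
            out.write(chunk)

download_object("export_160mb.txt", "/tmp/export_160mb.txt")
```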
And then write them back as a newer version or a different file? Use the Batch or Streaming API for that.
Hi Danil,
I need to do two things:
I can upload raw files to Data Lake with Airflow as well; is there a limit on file size in the Data Fabric API?
Thanks
Hi Kevin,
It's a file connection point (SFTP). The files are just text files.