
Pre-requisites for Data Import/Export

  1. Build and deploy an Application server. This server is used for the following steps:

    1. Importing the data through SFTP

    2. Exporting the data through SFTP

  2. Build and deploy an SFTP (Secure FTP) server.

    1. This server has the following folders. The naming convention for the folders follows the same standard pattern.

      • /SFTP/upload/CategoryFeed >>> All the source and destination files for this type of feed are saved in this folder.

      • /SFTP/upload/SubCategoryFeed >>> All the source and destination files for this type of feed are saved in this folder.

      • /SFTP/upload/ProductFeed >>> All the source and destination files for this type of feed are saved in this folder.

      • /SFTP/upload/PackageFeed >>> All the source and destination files for this type of feed are saved in this folder.

      • /SFTP/upload/ImageFeed >>> All the source and destination files for this type of feed are saved in this folder.

      • /SFTP/download/priceFeed >>> All the source and destination files for this type of feed are saved in this folder.

      • /SFTP/download/itemFeed >>> All the source and destination files for this type of feed are saved in this folder.

    2. Each of the above folders will have the following three subfolders:

      • ../Failure/ >>> If the Import/Export does not complete successfully, the CSV file is moved to this folder.

      • ../Success/ >>> If the Import/Export completes successfully, the CSV file is moved to this folder.

      • ../Logs/    >>> All the logs are automatically stored in this folder.

        • Each log file contains specific information on whether the process completed successfully or encountered a failure.

        • The file naming convention for the log files can be configured in the Python script being used to process the import/export.

        • The log file name includes a timestamp in the format _yyyymmdd_HHMMSS (a minimal sketch of this naming and of the Success/Failure routing appears after this list).

          • This ensures that two logs generated concurrently do not accidentally overwrite each other.

        • If there is a datatype mismatch for a specific field, the log file will record this information.

          • See the file's screenshot with descriptive references. Click here to open an actual sample file.

        • If the script tries to create an item (with a specific item ID) that already exists, the system fails to create it, and the log file will record this information.

          • See the file's screenshot with descriptive references. Click here to open an actual sample file.

        • If the script completes without any errors and all records are imported successfully, a log file is still generated indicating the successful run.

          • See the file's screenshot with descriptive references. Click here to open an actual sample file.

    3. All these folders should have read and write access enabled.

  3. Transferring files between the servers is done through a Python script; a sample connection sketch appears after this list. If you need assistance installing Python on the server, please contact UBS Project Managers.

  4. Set up SSH key-based authentication to the SFTP server. To get this setup and authentication configured, please contact UBS Project Managers.
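
The exact processing script is project-specific, but the log naming and the Success/Failure routing described in step 2 can be sketched as follows. This is an illustrative outline only: the feed folder, file names, and function names below are assumptions, not part of the delivered script.

  import shutil
  from datetime import datetime
  from pathlib import Path

  # Illustrative feed folder; any of the /SFTP/upload or /SFTP/download feed folders applies.
  FEED_ROOT = Path("/SFTP/upload/CategoryFeed")

  def log_file_path(feed_root: Path, feed_name: str) -> Path:
      """Build a log file name carrying the _yyyymmdd_HHMMSS timestamp suffix."""
      stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
      return feed_root / "Logs" / f"{feed_name}_{stamp}.log"

  def route_csv(feed_root: Path, csv_file: Path, succeeded: bool) -> Path:
      """Move a processed CSV into Success/ or Failure/ under its feed folder."""
      target_dir = feed_root / ("Success" if succeeded else "Failure")
      target_dir.mkdir(parents=True, exist_ok=True)
      return Path(shutil.move(str(csv_file), str(target_dir / csv_file.name)))

  # Example: a failed CategoryFeed import writes a timestamped log and parks the CSV in Failure/.
  # log = log_file_path(FEED_ROOT, "CategoryFeed")
  # route_csv(FEED_ROOT, FEED_ROOT / "categories.csv", succeeded=False)

Because the timestamp is part of the file name, logs written by concurrent runs land in distinct files, as noted in step 2.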
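
For the file transfer in step 3 and the SSH key-based authentication in step 4, one common approach in Python is the paramiko library; the sketch below assumes it. The host name, user name, key path, and local file paths are placeholders, not values from this document; substitute the details provided when the SFTP access is provisioned.

  import paramiko

  # Placeholder connection details; substitute the values provided by UBS Project Managers.
  SFTP_HOST = "sftp.example.com"
  SFTP_USER = "appuser"
  PRIVATE_KEY = "/home/appuser/.ssh/id_rsa"  # key registered for SSH key-based authentication

  # Authenticate with the SSH private key (no password) and open an SFTP session.
  client = paramiko.SSHClient()
  client.load_system_host_keys()
  client.connect(SFTP_HOST, username=SFTP_USER, key_filename=PRIVATE_KEY)
  sftp = client.open_sftp()

  # Import: push a source CSV from the application server into an upload feed folder.
  sftp.put("/data/outbound/categories.csv", "/SFTP/upload/CategoryFeed/categories.csv")

  # Export: pull a generated file from a download feed folder back to the application server.
  sftp.get("/SFTP/download/priceFeed/prices.csv", "/data/inbound/prices.csv")

  sftp.close()
  client.close()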
