Stacked Q&A

Efficient data import into a PostgreSQL DB
  • 1 vote
  • 2022-05-24 00:00

    I just designed a PostgreSQL database and need to choose a way of populating it with data. The data consists of txt and csv files, but can generally be any type of file containing delimited character data. I'm writing a Java program to give the data a consistent structure (there are lots of different kinds of files, and I need to work out what each column of a file represents so I can associate it with a column of my DB). I thought of two approaches:

    • Convert the files into one common format (JSON) and then have the DB regularly check the JSON file and import its contents.

    • Connect directly to the database via JDBC and send the strings to the DB (I still need to create a backup file containing what was inserted into the DB, so in both cases a file is created and written to).

    Which would you go with, time-efficiency-wise? I'm somewhat tempted by the first one, as it would be easier to handle a JSON file in the DB. If you have any other suggestions, they would also be welcome!

Answers
There are 2 answers in total.
  • Answered: 2022-05-24 00:00

    JSON or CSV

    If you have the liberty of converting your data to either CSV or JSON format, CSV is the one to choose, because you will then be able to use COPY FROM to bulk-load large amounts of data into PostgreSQL at once.

    CSV is a format supported by COPY, but JSON is not.
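
    For example, here is a minimal sketch of bulk-loading a CSV file from Java through the PostgreSQL JDBC driver's CopyManager API (the connection details, table name my_table, and file name data.csv are placeholders):

        import java.io.FileReader;
        import java.io.Reader;
        import java.sql.Connection;
        import java.sql.DriverManager;

        import org.postgresql.PGConnection;
        import org.postgresql.copy.CopyManager;

        public class BulkLoad {
            public static void main(String[] args) throws Exception {
                // Placeholder connection details -- adjust to your environment.
                try (Connection conn = DriverManager.getConnection(
                        "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {
                    CopyManager copyManager = conn.unwrap(PGConnection.class).getCopyAPI();
                    try (Reader reader = new FileReader("data.csv")) {
                        // COPY ... FROM STDIN streams the file through the JDBC
                        // connection, so the file does not have to live on the
                        // database server.
                        long rows = copyManager.copyIn(
                                "COPY my_table FROM STDIN WITH (FORMAT csv, HEADER true)",
                                reader);
                        System.out.println("Loaded " + rows + " rows");
                    }
                }
            }
        }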

    Directly inserting values

    This is the approach to take if you only need to insert a few (or maybe even a few thousand) records, but it is not suited to large numbers of records, because row-by-row inserts are slow. If you do take this route, batching the inserts cuts down the number of round trips; see the sketch below.
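
    A minimal sketch of batched JDBC inserts with a PreparedStatement, assuming a placeholder table my_table with two text columns:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.util.List;

        public class BatchInsert {
            public static void insertRows(List<String[]> rows) throws Exception {
                // Placeholder connection details -- adjust to your environment.
                try (Connection conn = DriverManager.getConnection(
                        "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {
                    conn.setAutoCommit(false); // commit once at the end, not per row
                    try (PreparedStatement ps = conn.prepareStatement(
                            "INSERT INTO my_table (col_a, col_b) VALUES (?, ?)")) {
                        for (String[] row : rows) {
                            ps.setString(1, row[0]);
                            ps.setString(2, row[1]);
                            ps.addBatch();
                        }
                        ps.executeBatch(); // send all queued inserts at once
                    }
                    conn.commit();
                }
            }
        }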

    If you choose this approach, you can create the backup using COPY TO. If you would rather create the backup file from your Java code instead, choosing CSV as its format means you would still be able to bulk-load it later, as discussed above.
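
    As a sketch of the COPY TO option, the same CopyManager API can export a table to a local CSV file (again, the connection details, table name my_table, and file name backup.csv are placeholders):

        import java.io.FileWriter;
        import java.io.Writer;
        import java.sql.Connection;
        import java.sql.DriverManager;

        import org.postgresql.PGConnection;
        import org.postgresql.copy.CopyManager;

        public class BackupTable {
            public static void main(String[] args) throws Exception {
                try (Connection conn = DriverManager.getConnection(
                             "jdbc:postgresql://localhost:5432/mydb", "user", "password");
                     Writer writer = new FileWriter("backup.csv")) {
                    CopyManager copyManager = conn.unwrap(PGConnection.class).getCopyAPI();
                    // COPY ... TO STDOUT streams the table contents back through
                    // JDBC into a local CSV file that COPY FROM can reload later.
                    long rows = copyManager.copyOut(
                            "COPY my_table TO STDOUT WITH (FORMAT csv, HEADER true)",
                            writer);
                    System.out.println("Backed up " + rows + " rows");
                }
            }
        }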

  • Accepted answer