I cannot import data from a CSV into PostgreSQL with dbWriteTable from R

Question:

I'm trying to import a CSV into a PostgreSQL database from R. On the PostgreSQL server, I created an empty database called "Dados".

    library(RPostgreSQL)
    library(sqldf)

    drv <- dbDriver("PostgreSQL")

    con <- dbConnect(drv, 
             dbname="Dados",
             port = 1704, 
             host="localhost",
             user = "postgres",
             password = "dados")

Connection is successful:

    dbListConnections(drv)
    # [[1]]
    # An object of class "PostgreSQLConnection"
    # Slot "Id":
    # [1] 8652    0

Then I create a small data.frame and save it to a file:

    tabela <- data.frame(var1 = c(1,2,3,NA),
                var2 = c(2,3,4,1))

    write.table(tabela,"tabela.csv", sep = "\t")

But then, when I run the dbWriteTable command, an error occurs:

    dbWriteTable(conn = con, name = "tabela1", value = paste0(getwd(),"/tabela.csv"))
    # Error in postgresqlExecStatement(conn, statement, ...) : 
    #   RS-DBI driver: (could not Retrieve the result : ERRO:  não pôde abrir 
    #   arquivo "c:/users/rogerio/desktop/tabela.csv" para leitura: Permission 
    #   denied )
    # [1] FALSE
    # Warning message:
    # In postgresqlImportFile(conn, name, value, ...) :
    #   could not load data into table

The "table1" is actually created in the DataBase, in PostGreSQL, but no data is imported.

The same procedure works perfectly in an SQLite connection…
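
For comparison, here is roughly what I mean by the SQLite case. This is just a sketch: the database file name is arbitrary, and it assumes an RSQLite version that still accepts a file path (and the sep/header arguments) in dbWriteTable:

    library(RSQLite)

    # SQLite is embedded, so R itself reads the file; no separate server process is involved
    con_lite <- dbConnect(SQLite(), dbname = "dados.sqlite")  # hypothetical local database file
    dbWriteTable(con_lite, "tabela1", "tabela.csv", sep = "\t", header = TRUE)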

Does anyone have any tips?

Answer:

I found that one way to work around it is the following:

    library(data.table)

    # Read the whole CSV into R with fread() and hand the resulting
    # data.table to the driver directly, instead of passing a file path
    postgresqlWriteTable(con = con, name = "tabela1",
                         value = fread("tabela.csv"))
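
To confirm the data actually arrived (a quick check of my own, not strictly part of the workaround), the table can be read back through the same connection:

    dbExistsTable(con, "tabela1")  # TRUE once the table exists
    dbReadTable(con, "tabela1")    # returns the imported rows as a data.frame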

But I still don't know whether this is a great alternative: fread() loads the whole file into RAM as a data.frame/data.table, which is then sent to the SQL server.

If the data set is very large, this can be a problem…
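
If memory becomes the bottleneck, one idea is to read the CSV in chunks and append each chunk to the table. The sketch below is only an illustration: chunk_size is a made-up value, and it assumes a tab-separated file with a header row that was written with row.names = FALSE, plus the con connection and the "tabela1" table from above:

    library(data.table)

    chunk_size <- 100000                               # hypothetical value; tune to available RAM
    header <- names(fread("tabela.csv", nrows = 0))    # read only the column names
    n_rows <- nrow(fread("tabela.csv", select = 1L))   # cheap row count (reads a single column)

    for (offset in seq(0, n_rows - 1, by = chunk_size)) {
      chunk <- fread("tabela.csv", sep = "\t", header = FALSE,
                     skip = offset + 1,                # +1 skips the header line
                     nrows = chunk_size, col.names = header)
      # Append each chunk to the existing table instead of holding the whole file in RAM
      dbWriteTable(con, "tabela1", as.data.frame(chunk),
                   append = TRUE, row.names = FALSE)
    }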
