Error in Webscraping process of Youtube videos on R – NA' does not exist in current working directory

Question:

I am developing an academic project in which I must analyze the text of 25 videos selected from different YouTube channels. My advisor gave me a script showing how he approaches this so I could adapt it to my videos, but I have barely started and the error in the title appears: NA' does not exist in current working directory. Here's the code:

library(abjutils)
library(tidytext)
library(reticulate)
reticulate::use_python("users/agnes/anaconda3/python")
library(spacyr)
spacy_initialize("pt_core_news_sm")
library(tidyverse)
library(magrittr)
library(stm)
library(tm)
library(ggridges)
library(formattable)
#library(subtools)
options(scipen = 999)
## Preparing the commands to download the subtitles
# Basic fields
fields_raw <- c("id", "title", "alt_title", "creator", "release_date",
                "timestamp", "upload_date", "duration", "view_count",
                "like_count", "dislike_count", "comment_count")
# Formatting the fields
fields <- fields_raw %>% 
  map_chr(~paste0("%(", ., ")s")) %>% 
  # use &&& as the field separator
  paste0(collapse = "&&&") %>% 
  # add quotes at the beginning and end of the string
  paste0('"', ., '"') 
channel_url <- "https://www.youtube.com/watch?v=rmZv19Iylu4"  

# build the youtube-dl query (command)
cmd_ytdl <- str_glue("youtube-dl -o {fields} -i -v -w --skip-download --write-auto-sub --sub-lang pt {channel_url}") 
view(cmd_ytdl)
# prepend the download directory
pasta_captions <- "C:/Users/agnes/Documents"  
fs::dir_create(pasta_captions) 
cmd <- str_glue("cd {pasta_captions} && {cmd_ytdl}") 
arquivos_captions <- dir(pasta_captions, pattern = "\\.vtt$", full.names = TRUE) 
amostra <- arquivos_captions[1] 
read_lines(amostra)[1:12]

Answer:

Hi, I didn't quite understand: do you want the comments or the subtitles? If it's the subtitles, there's an R package for that.
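About the error in your title: in your script the `cmd` string is built but never executed, so no `.vtt` files are ever downloaded, `dir()` returns an empty vector, and `amostra` ends up as `NA`, which `read_lines()` then reports as a file that "does not exist in current working directory". A minimal sketch of the missing step, assuming you are on Windows (your paths suggest so, and `shell()` is the Windows way to run a command line from R) and that `youtube-dl` is on your PATH:

```r
# run the command that was only assembled before; without this step,
# no .vtt files exist and amostra becomes NA
shell(cmd)

# guard against an empty directory before indexing
arquivos_captions <- dir(pasta_captions, pattern = "\\.vtt$", full.names = TRUE)
if (length(arquivos_captions) == 0) {
  stop("No .vtt files found in ", pasta_captions,
       "; check the youtube-dl output for errors.")
}
amostra <- arquivos_captions[1]
```

That said, the package below avoids youtube-dl entirely.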

# https://github.com/jooyoungseo/youtubecaption
remotes::install_github("jooyoungseo/youtubecaption")

library(youtubecaption)
# "You can't do data science in a GUI":
url <- "https://www.youtube.com/watch?v=cpbtcsGE0OA"
caption <- get_caption(url)

# Save the caption as an Excel file and open it right away:
get_caption(url = url, savexl = TRUE, openxl = TRUE)

# get the subtitles in Portuguese
url <- "https://www.youtube.com/watch?v=cpbtcsGE0OA"
caption <- get_caption(url, language = "pt")

Example source: https://github.com/jooyoungseo/youtubecaption
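Since your goal is to analyze the text of 25 videos, here is a sketch of how the downloaded captions could flow into tidytext (which you already load in your script). It assumes `get_caption()` returns a tibble with a `text` column, as shown in the package README, and uses a hypothetical `urls` vector standing in for your 25 video links:

```r
library(tidyverse)
library(tidytext)
library(youtubecaption)

# hypothetical placeholder for your 25 video links
urls <- c("https://www.youtube.com/watch?v=cpbtcsGE0OA")

# download each caption and stack them into one tibble
captions <- map_dfr(urls, ~ get_caption(.x, language = "pt"))

# one row per word, ready for stop-word removal and counting
palavras <- captions %>%
  unnest_tokens(word, text)

count(palavras, word, sort = TRUE)
```

From `palavras` you can remove stop words and continue with the stm/tm steps in your advisor's script.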

If you allow me, I suggest you read this text: https://www.curso-r.com/blog/2018-02-21-2cents/
