If you want to keep a copy of your post history, with all the pictures and comments, as a keepsake or in case your beloved LiveJournal gives up the ghost or your login gets stolen, here is an instruction on how to do it with the R language (https://www.r-project.org/).
myExportLJ <- function() {
  ########### R code to export LiveJournal blog posts together with images to a local drive
  ########### The code also creates an index file listing all exported pages
  ########### Create an img folder in the default (working) folder before running

  library(XML)         ## install.packages("XML")
  library(data.table)  ## install.packages("data.table")
  # library(jpeg)

  # local constants
  ######################################
  LJ_blog <- "http://papasonik.livejournal.com/"
  LJ_start_page <- "17466.html"
  LJ_number_of_posts_to_retrieve <- 540
  ######################################
  LJ_current_file <- LJ_start_page

  # template for the output index file; the HTML tags in the original post
  # were swallowed by the page renderer, so a minimal skeleton is
  # reconstructed here around the surviving "Index list" title
  index_file <- c("<html>", "<head>", "<title>Index list</title>", "</head>",
                  "<body>", "<h1>Index list</h1>", "<ul>")
  tryCatch({
    for (rep_count in 1:LJ_number_of_posts_to_retrieve) {
      print(paste("Starting to read file: ", paste(LJ_blog, LJ_current_file, sep = "")))
      con <- url(paste(LJ_blog, LJ_current_file, sep = ""))
      htmlCode <- readLines(con, n = -1L, ok = TRUE, warn = FALSE, skipNul = FALSE)
      close(con)
      # collapse the page into one large string
      oneliner <- paste(htmlCode, sep = "", collapse = "\n")
      # pull every http URL out of the page; the statement that built
      # filelist was lost from the original post, so it is reconstructed
      # here with a simple regex over the collapsed page text
      filelist <- unlist(regmatches(oneliner, gregexpr("http[^\"'<> ]+", oneliner)))

      # put the list of all links found on the page into a table
      DT <- data.table(filename = filelist)
      # get only JPG files from livejournal
      DT_files_url <- DT[filename %like% "http.*jpg"]
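The `%like%` filter from data.table can be tried on its own; a small sketch with made-up URLs:

```r
library(data.table)  ## data.table provides the %like% regex operator

# toy table of links (URLs invented for illustration)
DT <- data.table(filename = c(
  "http://ic.pics.livejournal.com/a/1/pic.jpg",
  "http://example.com/page.html",
  "http://l-stat.livejournal.net/img/icon.gif"))

# keep only rows whose filename matches the regex
DT_files_url <- DT[filename %like% "http.*jpg"]
nrow(DT_files_url)  # 1 row survives the filter
```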
      if (nrow(DT_files_url) > 0) {
        for (i in 1:nrow(DT_files_url)) {
          # turn the URL into a flat file name under the img folder
          # (note: index the filename column, not the whole data.table row)
          file_name <- paste("img/i",
                             gsub("_jpg", ".jpg",
                                  gsub(":", "",
                                       gsub("/", "_",
                                            gsub("\\.", "_",
                                                 gsub("^http", "", DT_files_url$filename[i]))))),
                             sep = "")
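To see what the nested gsub chain produces, here is a quick run on a hypothetical image URL (the path is made up):

```r
# a hypothetical LiveJournal image URL
u <- "http://ic.pics.livejournal.com/user/123/456/pic.jpg"

# same transformation as above: strip the scheme, flatten dots, slashes
# and colons into underscores, then restore the .jpg extension
file_name <- paste("img/i",
                   gsub("_jpg", ".jpg",
                        gsub(":", "",
                             gsub("/", "_",
                                  gsub("\\.", "_",
                                       gsub("^http", "", u))))),
                   sep = "")
file_name  # "img/i__ic_pics_livejournal_com_user_123_456_pic.jpg"
```

Every URL thus maps to a unique flat file name, so images from different posts cannot collide inside the single img folder.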
          # skip files that were already downloaded
          if (file.exists(file_name)) {
            print(paste("WARNING: file exists: ", file_name))
          } else {
            # download the files one by one and save them into img
            if (regexpr("?gl", toString(DT_files_url$filename[i]), fixed = TRUE) > 1) {
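The `fixed = TRUE` argument makes regexpr do a literal substring search, so the `?` is treated as an ordinary character rather than a regex quantifier; a quick check on made-up URLs:

```r
# literal search for "?gl" in a URL that carries a query string
regexpr("?gl", "http://pics.livejournal.com/pic.jpg?gl=1", fixed = TRUE)
# returns the match position (> 1)

# when the substring is absent, regexpr returns -1
regexpr("?gl", "http://example.com/pic.jpg", fixed = TRUE)  # -1
```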