javascript,node.js,callback,fs
As I said in my comment, you are using an asynchronous call to load a file. You want the result of someMethod stored into the global variable resultVal. Except this isn't possible. When you call loadFile(done), an asynchronous call is made to the server. This call is being resolved by...
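A minimal sketch of that pattern, assuming a hypothetical loadFile(callback) that wraps fs.readFile and an illustrative file name; the result is only usable inside the callback, never via a global variable:

    var fs = require('fs');

    // Hypothetical loader: the contents only exist once the callback fires.
    function loadFile(done) {
        fs.readFile('data.txt', 'utf8', function (err, contents) {
            if (err) return done(err);
            done(null, contents);
        });
    }

    loadFile(function (err, resultVal) {
        if (err) return console.error(err);
        console.log(resultVal); // use the value here, inside the callback
    });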
javascript,node.js,promise,bluebird,fs
Normally promise-driven code looks like this: operation.then(function(result) { return nextOperation(); }).then(function(nextResult) { return finalOperation(); }).then(function(finalResult) { }) There's a lot going on in your example, but the general idea would be something like: Promise.resolve().then(function() { if (nodir) { return fs.mkdir('somedirectory').catch(function(err) { log('error while trying to create the directory:\n\t %s', err);...
node.js,promise,bluebird,fs,node-request
You have to tell request that the data is binary: requestAsync(uri, { encoding : null }) Documented here: encoding - Encoding to be used on setEncoding of response data. If null, the body is returned as a Buffer. Anything else (including the default value of undefined) will be passed as...
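As a hedged sketch (requestAsync is assumed to come from promisifying the request module with Bluebird, and the URL is just illustrative), the resulting Buffer can then be written straight to disk:

    var Promise = require('bluebird');
    var fs = Promise.promisifyAll(require('fs'));
    var requestAsync = Promise.promisify(require('request'));

    var uri = 'https://example.com/image.png'; // illustrative URL

    requestAsync(uri, { encoding: null }).then(function (response) {
        // response.body is a Buffer because encoding was set to null
        return fs.writeFileAsync('download.bin', response.body);
    });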
1. Enable Compilation of .jsx files Inform Webpack to use babel-loader when it encounters files ending in .jsx. See https://github.com/babel/babel-loader#usage So either add the following to your list of loaders: { test: /\.jsx$/, exclude: /node_modules/, loaders: ['react-hot-loader', 'babel-loader'] }, or modify your existing test so that it catches both .js and...
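The combined test would presumably look something like this; a sketch of a webpack 1 style config excerpt, not the question's exact file:

    // webpack.config.js – minimal excerpt with a single combined rule
    module.exports = {
        module: {
            loaders: [
                {
                    test: /\.jsx?$/,          // the optional x matches both .js and .jsx
                    exclude: /node_modules/,
                    loaders: ['react-hot-loader', 'babel-loader']
                }
            ]
        }
    };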
You are calling res.end() before any of your res.write() statements have executed. The callback to content.walk() is asynchronous which means it happens some indeterminate time in the future. You need to call res.end() AFTER all the res.write() statements are done. The way exports.walk() is structured, there is no way for...
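A hedged sketch of the general fix, assuming a walk(dir, done) whose callback fires exactly once with all entries (the question's exports.walk may be shaped differently) and an Express-style req/res pair:

    var fs = require('fs');

    // Assumed shape: done(err, names) is called once when everything is collected,
    // so res.end() can safely go inside the callback, after the writes.
    function walk(dir, done) {
        fs.readdir(dir, done);
    }

    function handler(req, res) {
        walk('.', function (err, names) {
            if (err) { res.statusCode = 500; return res.end(); }
            names.forEach(function (name) {
                res.write(name + '\n');
            });
            res.end(); // only after every write has been issued
        });
    }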
javascript,xml,node.js,xml-parsing,fs
Your XML file isn't valid XML. It has multiple root nodes. Try parsing XML like this and it should work: <?xml version="1.0" encoding="utf-8"?> <!-- 1 = video , 2 = html , 3 = pdf --> <resource> <stage_type>2</stage_type> <resource_name>http://www.XXX</resource_name> </resource> You could rename that resource node to whatever you want,...
Node has a built-in way to check for valid files: function File(file){ if (file instanceof File) return file; if (!(this instanceof File)) return new File(file); if (typeof file !== "string") throw new Error("Useful error." + typeof(file)); var stat = fs.statSync(file); if ( stat && stat.isFile() ) { this.fullName =...
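A smaller hedged variant of the same idea, just fs.statSync plus isFile() (the path is only an example):

    var fs = require('fs');

    function isValidFile(file) {
        try {
            return fs.statSync(file).isFile(); // statSync throws if the path does not exist
        } catch (e) {
            return false;
        }
    }

    console.log(isValidFile('./package.json'));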
OK, after careful debugging the problem has been identified. As Joseph mentioned, it is not related to fs.writeFile() at all. In my application there are in fact two file writes running "concurrently": the one listed in my question and another one, writing data progressively as it calculates some averages. The other, progressively writing function had...
node.js,file-rename,fs,file-move
You can only rename a file to a file or a directory to a directory; you can't mix them, as in your situation (file to directory). In that case, add the name of the file to the newPath argument: fs.renameSync(__dirname + "\\file.txt", __dirname + "\\destination\\file.txt"); ...
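A hedged variant using path.join so the separators work on any platform (the file and folder names are just the ones from the example):

    var fs = require('fs');
    var path = require('path');

    fs.renameSync(
        path.join(__dirname, 'file.txt'),
        path.join(__dirname, 'destination', 'file.txt') // full target filename, not just the folder
    );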
javascript,node.js,post,stream,fs
Try to use the following: app.use(bodyParser.text({ type : 'application/text-enriched', limit: '2mb' })); This works for me :) Regards, Shimi...
node.js,binary,dropbox-api,node-webkit,fs
Got it working! client.readFile("package.nw", {binary : true}, function(error, data) { if (error) { return showError(error); // Something went wrong. } toastr.info("Storing update.."); console.log(data); fs.writeFile("package.nw", data, 'binary', function(err) { if (err) { return showError(err); } toastr.info("Update complete!"); }); }); Had to use {binary : true} to get it working!...
I'm guessing you're interested in how an options parameter generally works in JavaScript, as opposed to what the parameters are, which are stated in the docs: options Object encoding String | Null default = 'utf8' mode Number default = 438 (aka 0666 in octal) flag String default = 'w' Generally,...
javascript,node.js,asynchronous,fs
fs.exists is an oddball in that it doesn't provide an error parameter to its callback function. Instead it only provides a single exists parameter that indicates whether the file was found or not. Presumably, if there was an error, exists would be false. As such you need to wrap its...
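A brief sketch contrasting the two callback shapes (the path is illustrative); fs.stat is often the easier drop-in when you want a conventional error argument:

    var fs = require('fs');

    // fs.exists: only an `exists` boolean, no error argument at all
    fs.exists('./maybe.txt', function (exists) {
        console.log('exists?', exists);
    });

    // fs.stat: conventional (err, result) callback, so failures are reportable
    fs.stat('./maybe.txt', function (err, stats) {
        if (err) return console.error('stat failed:', err.code);
        console.log('is file?', stats.isFile());
    });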
UPDATE: the client post details helped. You're not posting a file stream (which would have worked); you're posting a form stream. The good news is there are good modules for handling form streams. You'll need to pipe the request stream into a form-handling stream (such as busboy) which will...
node.js,express,openshift,mustache,fs
It's probably a lot easier to just use a handlebars Express views engine such as hbs. Example: // ... var hbs = require('hbs'); app.engine('hbs', hbs.__express); app.engine('html', hbs.__express); app.set('view engine', 'hbs'); app.set('views', __dirname + '/views'); hbs.localsAsTemplateData(app); app.get('/', function(req, res) { res.render('index.html', { message: 'Homepage!' }); }); // ... ...
You are trying to open a file for reading that doesn't exist (the dest file -- of course it doesn't exist). You want to open the file for writing. destFile = fs.createWriteStream(destFolder + '/' + file.thumbnail.name); ...
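A hedged sketch of the intended copy; destFolder and file.thumbnail are placeholders standing in for the question's upload-handler variables:

    var fs = require('fs');

    // Placeholders for the variables coming from the upload handler
    var destFolder = './thumbnails';
    var file = { thumbnail: { path: '/tmp/upload_123', name: 'thumb.png' } };

    fs.createReadStream(file.thumbnail.path)                                  // read the uploaded temp file
      .pipe(fs.createWriteStream(destFolder + '/' + file.thumbnail.name));    // write the new file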
The ENOENT error is because the file or folder does not exist. The only way to get around this problem is to open a file that does exist. Perhaps the user's folder does not exist before you try this operation? Check to see if the folder exists. Another thing to...
On the client side the code should be: $(document).ready(function(){ $("#save").click(function(event){ var content = editor.getValue(); console.log("This is content: " + content); $.ajax({ url:'/getcode', type:'POST', data: {'code': content}, processData: 'false', }); event.preventDefault(); return false; }); }); On the server side the code should be: router.post('/getcode', function(req, res){ var usercode = req.body.code; newname =...
javascript,json,node.js,string,fs
I'd recommend prepending the separator instead, so that you can dynamically adjust it after the first call: file.write('[\n') var sep = ""; forEach(function(objectToAppend) { file.write(sep + JSON.stringify(objectToAppend)) if (!sep) sep = ",\n"; }); ...
image_file.end() does not complete synchronously; you need to wait for its callback to be sure that the data has been written: image_file.end([chunk], [encoding], [callback]) Or you can watch for the finish event on the writeStream: image_file.on('finish', callback) ...
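A hedged sketch of both options, with a made-up file name and payload:

    var fs = require('fs');
    var image_file = fs.createWriteStream('./image.jpg');

    // Option 1: listen for the write stream's 'finish' event
    image_file.on('finish', function () {
        console.log('all data flushed to disk');
    });

    // Option 2: the optional callback to end() also fires once everything is written
    image_file.end(Buffer.from('fake image bytes'), function () {
        console.log('end() callback fired');
    });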
Judging from your path, it is extremely likely that you are testing an absolute path when you mean to be testing a relative path. If it is a relative path from the current process working directory, try omitting the initial slash: fs.stat('img/items/item.jpg', ... If it is a relative path from...
javascript,node.js,parsing,csv,fs
The csv module can do this. Their examples page has this simple asynchronous example with streams that you may be able to use as a starting point: var csv = require('csv'); var generator = csv.generate({seed: 1, columns: 2, length: 20}); var parser = csv.parse(); var transformer = csv.transform(function(data){ return data.map(function(value){return...
javascript,node.js,gzip,zlib,fs
Like in your previous question, fs.readFile is returning a string when you add 'utf8' as the encoding type. Remove this and readFile will return a buffer, which zlib.gunzip can work with. fs.readFile(tmp_path, function(err, body) { fs.unlink(tmp_path, function(err) { if (err) throw err; }); prepareBody(req, res, body); }); ...
You can use the sync version of fs.write: fs.writeSync(). Actually, for saving a log string, appending the message to the file looks like a better fit to me; have a look at fs.appendFileSync here.
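A minimal hedged sketch of the append approach (the log file name is just an example):

    var fs = require('fs');

    function log(message) {
        // Blocks until the line is on disk; fine for simple logging, avoid in hot paths
        fs.appendFileSync('./app.log', message + '\n');
    }

    log('server started');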
The second argument to fs.writeFile is a string or buffer, so you are actually writing the string 'http://url' which results in the image being corrupted. Instead, you want to pipe the response of a request to that url to the file, e.g. http.get(url, function (res) { res.pipe(fs.createWriteStream('./externalImages/'+imageId+'.jpg')); }); ...
Has it worked before? It's possible that there is still a leftover process connected to port 8080. Try changing the port number to see if that is the problem.
It's hard to know all the things that might be wrong here since you only show a small piece of your code, but one thing that is wrong is the filename string. The \ character in Javascript is an escape mechanism so that the string 'C:\Users\i123\Desktop\test.txt' is not what you...
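A short sketch of the two usual fixes, escaping the backslashes or using forward slashes (reusing the path from the question):

    var fs = require('fs');

    // Escaped backslashes...
    fs.readFile('C:\\Users\\i123\\Desktop\\test.txt', 'utf8', function (err, data) {
        if (err) return console.error(err);
        console.log(data);
    });

    // ...or forward slashes, which the Windows APIs also accept
    fs.readFile('C:/Users/i123/Desktop/test.txt', 'utf8', function (err, data) {
        if (err) return console.error(err);
        console.log(data);
    });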
Ok, so I had a few extra minutes and decided to have a look for you. I refactored your code a little, but the basic structure should be easy to recognize. var baseDir = './test', path = require('path'), fs = require('fs'); // watch the directory for new files fs.watch(baseDir, function(event,...
You can call fs.unlink() on the finish event of res. Or you could use the end event of the file stream: var file = fs.createReadStream(fileName); file.on('end', function() { fs.unlink(fileName, function() { // file deleted }); }); file.pipe(res); ...
node.js,unit-testing,mocha,fs,chai
The best option is to use mock-fs; if you provide it with no files, it will return ENOENT. Just be careful to call restore after your test to avoid any impact on other tests. Add at the beginning var mock = require('mock-fs'); And the test before(function() { mock(); }); it('should throw an...
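A hedged sketch of what such a test might look like with mocha and chai, exercising fs.readFileSync directly against the empty mocked filesystem:

    var mock = require('mock-fs');
    var expect = require('chai').expect;
    var fs = require('fs');

    describe('missing file handling', function () {
        before(function () {
            mock(); // empty in-memory filesystem: every path now yields ENOENT
        });

        after(function () {
            mock.restore(); // put the real fs back so other tests are unaffected
        });

        it('should throw when the file is missing', function () {
            expect(function () {
                fs.readFileSync('/no/such/file.json');
            }).to.throw(/ENOENT/);
        });
    });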
node.js,plugins,permissions,sails.js,fs
I don't think that Node exposes any sandboxing functionality, so when you load JS code into Node, that code can do whatever it wants. From your description your plugins are more like browser JavaScript code, so I think you could use a headless browser to execute your code...
You can use the node-watch module: var watch = require('node-watch'); watch('somedir_or_somefile', function(filename) { console.log(filename, ' changed.'); }); If you are operating under the assumption that the file is only being appended to, you could store the number of bytes read so far in a variable, and then inside your watch...
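A hedged sketch of that byte-offset idea, combining node-watch with a read stream that starts where the last read stopped (the file name is illustrative and the file is assumed to be append-only):

    var fs = require('fs');
    var watch = require('node-watch');

    var logFile = './some.log';
    var bytesRead = 0; // how far into the file we have already read

    watch(logFile, function () {
        var size = fs.statSync(logFile).size;
        if (size <= bytesRead) return; // nothing new appended

        fs.createReadStream(logFile, { start: bytesRead, end: size - 1 })
            .on('data', function (chunk) {
                process.stdout.write(chunk); // only the newly appended bytes
            });
        bytesRead = size;
    });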
You have to wait to look at the result until the last fs.readFile() operation has finished. These are async operations and they complete some time in the future. You are examining the result before any of them have finished. There are many ways to approach solving this, but this method...
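One hedged way to approach it is a simple pending counter; readAll and the file names here are hypothetical, the point is only that the final callback fires after the last read completes:

    var fs = require('fs');

    function readAll(paths, done) {
        var results = {};
        var pending = paths.length;

        paths.forEach(function (p) {
            fs.readFile(p, 'utf8', function (err, data) {
                results[p] = err ? null : data;
                if (--pending === 0) done(results); // only after the last read finishes
            });
        });
    }

    readAll(['a.txt', 'b.txt'], function (results) {
        console.log(results); // safe to examine everything here
    });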
You are mixing the flowing mode and non-flowing mode APIs a bit. From the docs: Note that the end event will not fire unless the data is completely consumed. This can be done by switching into flowing mode, or by calling read() repeatedly until you get to the end. Simplest...
You want to do i < list.length using less than < instead of less than or equal to <=. The array is 0 indexed, you start getting items from list[0], and there are items for all the i from 0 to list.length - 1, but list[list.length] does not exist....
The easiest way with Bluebird would be to use props: function getFiles(paths) { // an array of file/directory paths return Promise.props(paths.reduce(function (obj, path) { obj[path.split("/").pop()] = fs.statAsync(path).then(function (stat) { if (stat.isDirectory()) return fs.readdirAsync(path).map(function (p) { return path + "/" + p; }).then(getFiles); else if (stat.isFile()) return fs.readFileAsync(path); }).error(function (e) {...
Because Node applies the umask to your mode. Type umask in your console and you will see 022 or 0022. You can override it for your Node process with process.umask(newmask);...
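A hedged illustration of the interaction (run in a scratch directory; the directory names are arbitrary):

    var fs = require('fs');

    // With the default umask of 022, a requested mode of 0o777 ends up as 0755 on disk.
    fs.mkdirSync('./with-umask', 0o777);

    // Clearing the umask first lets the requested mode through untouched.
    var previous = process.umask(0);
    fs.mkdirSync('./without-umask', 0o777);
    process.umask(previous); // restore the old umask afterwards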
It seems that you are looking to solve a problem similar to How to do `tail -f logfile.txt`-like processing in node.js? As per the first response there, I would look into the node-tail module....
This sounds similar to a problem I had. Downloading files (not just PDFs) produced strange results. This is more likely your issue, not the fs functions. Rather than using the built-in node http stuff we chose to use the Request library (npm request) and performed downloads in this fashion:...
node.js,express,fs,angular-fullstack
It's readFile() not readfile() (uppercase F vs lowercase f).
You can use an async flow control module like async to kill the process after all files are written. I'd also recommend cluster.worker.disconnect() so that the node process will simply exit gracefully, but that isn't a requirement. async.forEach(seperatedArrayItem, function(item, done){ // append file and call 'done' when it is written....
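A hedged sketch of the overall shape, with hypothetical item values and an output file standing in for the question's variables:

    var fs = require('fs');
    var async = require('async');
    var cluster = require('cluster');

    var items = ['a', 'b', 'c']; // stands in for seperatedArrayItem

    async.forEach(items, function (item, done) {
        fs.appendFile('./out.txt', item + '\n', done); // call done once this item is written
    }, function (err) {
        if (err) return console.error(err);
        // Every write has finished; let the worker exit gracefully instead of killing it.
        if (cluster.worker) cluster.worker.disconnect();
    });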