If program_spitting_out_text runs continuously and keeps its file open, there's not a lot you can do. Even deleting the file won't help, since the program will still continue to write to the now "hidden" file (the data still exists, but there is no directory entry for it) until it closes it, at...
python,scikit-learn,pipeline,feature-selection
The pipeline calls transform on the preprocessing and feature selection steps if you call pl.predict. That means that the features selected in training will be selected from the test data (the only thing that makes sense here). It is unclear what you mean by "apply" here. Nothing new will be...
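A minimal sketch of that behaviour, assuming a SelectKBest/LogisticRegression pipeline and a recent scikit-learn (the dataset and step names are illustrative, not from the question):

    from sklearn.pipeline import Pipeline
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.linear_model import LogisticRegression
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=200, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    pl = Pipeline([
        ('select', SelectKBest(f_classif, k=5)),   # fitted on the training data only
        ('clf', LogisticRegression()),
    ])
    pl.fit(X_train, y_train)
    print(pl.predict(X_test))   # the same 5 training-selected features are applied to the test data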
You could use sed. $ sed 's/.*\(0x[0-9a-f][0-9a-f][0-9a-f][0-9a-f]\).*/\1/' file 0x8001 channel 1: 123 channel 2: 234 channel 3: 345 0x8002 channel 1: 456 channel 2: 567 channel 3: 678 I assumed that there is only one hexadecimal string like 0x8001 present in a line....
python,google-app-engine,mapreduce,pipeline
At first, I attempted thinkjson's CloudStorageLineInputReader but had no success. Then I found this pull request...which led me to rbruyere's fork. Despite some linting issues (like the spelling of GoolgeCloudStorageLineInputReader), at the bottom of the pull request it is mentioned that it works fine, and it asks whether the project...
Finally fixed the issue. The problem was that I had to "unblock" the jpegtran.exe file. That's it. ...
string,haskell,pipeline,conduit
I'm not sure I'm following all the questions you ask. I'll just address the type question: the error is only arising because of the signature you supplied. Doesn't this work? sinkPhotos :: MonadResource m => Sink Photo m () sinkPhotos = do mphoto <- await case mphoto of Nothing ->...
asp.net,pipeline,asp.net-4.5,asp.net-5,.net-4.6
The biggest difference in my opinion is the modularity of the new request pipeline. In the past, the application lifecycle followed a relatively strict path that you could hook into via classes implementing IHttpModule. This would allow you to affect the request, but only at certain points along the way...
php,mongodb,aggregation,pipeline
It's true that the "standard projection" operations available to MongoDB methods such as .find() will only return at most a "single matching element" from the array that is queried, by either the positional $ operator form in the "query" portion or $elemMatch in the "projection" portion. In order...
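For the aggregation alternative the answer is heading towards, here is a hedged sketch in Python/pymongo (the original question uses PHP; the database, collection and "items" array field are hypothetical, and pymongo 3.x is assumed): the classic pattern is to $unwind the array, $match the elements you want, then $group them back with $push.

    from pymongo import MongoClient

    coll = MongoClient().test.collection        # hypothetical database/collection
    pipeline = [
        {"$match": {"items.size": "L"}},        # documents containing at least one match
        {"$unwind": "$items"},                  # one document per array element
        {"$match": {"items.size": "L"}},        # keep only the matching elements
        {"$group": {"_id": "$_id", "items": {"$push": "$items"}}},   # rebuild the array
    ]
    for doc in coll.aggregate(pipeline):
        print(doc)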
powershell,parameters,pipeline
The Quick Answer if (Test-Path variable:\object) This will be true even if the parameter is not passed, so it's always true as long as you have $object as a parameter possibility. To fix it: You can do a few things. The easiest is just to use: if ($object) Just like...
Use a to append; you are overwriting each time using w, so you only get the last piece of data: rss.write_xml(open("pyrss2gen.xml", "a")) If you look at the original code, you can see that it also uses a, not w. You might want to use with when opening files, or at least closing...
c#,asp.net,pipeline,httpcontext
Current is a property, not a field, so it's actually a static method. This method can return different instances for different threads, and it really does. If you're developing a multithreaded web application, keep a few things in mind. Don't use ThreadStaticAttribute. It works in Windows and console applications, but it may...
There are several reasons why Validate will fail, including: The Schema is not a Flat File Schema The fully qualified name cannot be resolved (if it's in a referenced Assembly, you may have to Rebuild, then close and reopen Visual Studio) You can't ignore this error; otherwise, the project will not Build....
c++,multithreading,performance,io,pipeline
17 ms per record is extremely high; it should not be difficult to improve upon that unless you are using some seriously antiquated hardware. Upgrade the hardware. SSDs, RAID striping and PCI Express hard disks are designed for this kind of activity. Read the file in larger chunks at a time,...
python,scrapy,settings,pipeline
Since a dictionary in Python is an unordered collection and ITEM_PIPELINES has to be a dictionary (as a lot of other settings, like, for example, SPIDER_MIDDLEWARES), you need to, somehow, define an order in which pipelines are applied. This is why you need to assign a number from 0 to...
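For illustration, a sketch of what that looks like in settings.py (the pipeline class paths are hypothetical); lower numbers run earlier:

    ITEM_PIPELINES = {
        'myproject.pipelines.CleanItemPipeline': 100,     # runs first
        'myproject.pipelines.ValidateItemPipeline': 300,
        'myproject.pipelines.StoreItemPipeline': 800,     # runs last
    }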
As far as I see it, there is no need for you to override the ImagesPipeline, because you are not modifying its behavior. But, since you are doing it, you should do it properly. When overriding ImagesPipeline, two methods should be overridden: get_media_requests(item, info) should return a Request for every...
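A sketch of such an override, assuming the conventional image_urls/image_paths item fields and Scrapy 1.x import paths (older versions use scrapy.contrib.pipeline.images):

    import scrapy
    from scrapy.exceptions import DropItem
    from scrapy.pipelines.images import ImagesPipeline

    class MyImagesPipeline(ImagesPipeline):

        def get_media_requests(self, item, info):
            # one Request per URL stored on the item
            for url in item.get('image_urls', []):
                yield scrapy.Request(url)

        def item_completed(self, results, item, info):
            # results is a list of (success, info_dict) tuples
            paths = [data['path'] for ok, data in results if ok]
            if not paths:
                raise DropItem("no images downloaded")
            item['image_paths'] = paths
            return item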
Based on the little information I think your best bet is to ensure server.php does only one thing: opening a socket (e.g. on port 8001) and serve. It shouldn't contain any methods at all. You put your methods in another script, let's call it handler.php. In server.php you include it:...
assembly,arm,pipeline,instruction-set
The answer is right there in the question: between 1 and 3 cycles depending on things. Even on something as relatively simple as Cortex-M4 there are enough factors that it's not necessarily possible (or useful) to specify some hard-and-fast rule. However, that's not to say we can't do a bit...
python,machine-learning,scikit-learn,pipeline
Following the same method as you are describing, namely doing feature selection and classification with two distinct Random Forest classifiers grouped into a Pipeline, I ran into the same issue. An instance of the RandomForestClassifier class does not have an attribute called threshold. You can indeed manually add one, either...
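If you are on scikit-learn 0.17 or later, one way around the missing attribute is to let SelectFromModel wrap the selector forest, so the threshold lives on the wrapper rather than on the classifier; a sketch under that assumption (the data is synthetic):

    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import SelectFromModel

    X, y = make_classification(n_samples=300, n_features=30, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    pl = Pipeline([
        ('select', SelectFromModel(RandomForestClassifier(n_estimators=100, random_state=0),
                                   threshold='median')),   # keep above-median-importance features
        ('clf', RandomForestClassifier(n_estimators=100, random_state=0)),
    ])
    pl.fit(X_train, y_train)
    print(pl.score(X_test, y_test))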
mysql,delimiter,pipeline,text-formatting
Use the string REPLACE function of MySQL: SELECT REPLACE('www.mysql.com', 'w', 'Ww'); -> 'WwWwWw.mysql.com' or check out another function you could need: https://dev.mysql.com/doc/refman/5.0/en/string-functions.html#function_replace In your case you might want to replace | with \n or \r. SELECT REPLACE(column, '|', '\r');...
If you can't find a Powershell cmdlet that will do exactly what you want, you can always roll your own: function paste ($separator = ',') {$($input) -join $separator} & { echo foo; echo bar; echo baz; } | paste foo,bar,baz ...
gstreamer,pipeline,java-gstreamer
The correct way would be to send the EOS event to the pipeline and wait until you get it as a GstMessage on the bus. If you say you have tried that and it didn't work it could be a bug in the involved elements (unlikely), in the java bindings...
These processors are used when you upload files into the Sitecore Media Library. The CheckPermissions processor checks permissions for the folder where you upload files; if you don't have permission, it aborts the upload. The CheckSize processor checks whether the size of each uploaded file is greater than the Media.MaxSizeInDatabase value from...
bash,pipe,named-pipes,pipeline,io-redirection
The syntax is not > >, but > >(). That's a process substitution. Basically, bash creates a named pipe; the standard input of the process inside the substitution is made available, effectively as a file, to the process outside. Whatever gets redirected to the pipe will be read by the process inside...
MIPS has a five-stage pipeline. An effect is that the instruction after a branch will be performed regardless of whether the branch is taken. In simplified terms, compared to traditional assembly languages you need to move the branch up one instruction earlier. So if the branch is taken then code...
Make sure that the ls you are running from the shell and the ls that is running in your program are the same program. Your program is specifying /bin/ls as the program to run; you can find out what is being run when you type the command at the shell...
bash,loops,while-loop,stream,pipeline
The problem that you are running into is that, under bash (other shells may differ), a pipeline terminates only after all commands in the pipeline are finished. One solution is to use: while [ 1 ]; do cat pattern_file || break done | socat - /dev/ttyS0 If socat terminates, then...
I started from scratch and the following spider should be run with scrapy crawl amazon -t csv -o Amazon.csv --loglevel=INFO so that opening the CSV file with a spreadsheet shows the scraped data for me. Hope this helps :-) import scrapy class AmazonItem(scrapy.Item): rating = scrapy.Field() date = scrapy.Field() review = scrapy.Field() link =...
You could use the -match operator on the entire resulting collection: $ci = Get-ChildItem | Select-Object -ExpandProperty "Name" $ci -match "Desktop" Now, the last statement will return all strings that match "Desktop". If no match is found, nothing is returned. So now we can do (in PowerShell 3.0 and above):...
powershell,encoding,pipe,pipeline
Try using set-content: create-png | set-content -path myfile.png -encoding byte If you need additional info on set-content just run get-help set-content You can also use 'sc' as a shortcut for set-content. Tested with the following, produces a readable PNG: function create-png() { [System.Drawing.Bitmap] $bitmap = new-object 'System.Drawing.Bitmap'([Int32]32,[Int32]32); $graphics = [System.Drawing.Graphics]::FromImage($bitmap);...
javascript,css,ruby-on-rails,pipeline,assets
In app/assets/stylesheets/application.css it should be *= require jquery.nouislider and not //= require jquery.nouislider. Also try replacing the Gemfile declaration with gem 'jquery-nouislider-rails', '~> 4.0.1.1', specifying the version, and restart the server. Working setup: I simply downloaded noUiSlider here and added it to the app/assets/javascripts folder, then called it in application.js as...
For a start, cat $(ls) is not the right way to go about this - cat * would be more appropriate. If the number of files is too high, you can use find like this: find -exec cat {} + This combines results from find and passes them as arguments...
powershell,delegates,pipeline,scriptblock
That'll work. You've just got some syntax issues with your function definitions and how you're passing the parameters: Function Get-Square{ [CmdletBinding()] Param([Parameter(ValueFromPipeline=$true)]$x) $x*$x } Function Get-Cube{ [CmdletBinding()] Param([Parameter(ValueFromPipeline=$true)]$x) $x*$x*$x } Function Get-Result{ [CmdletBinding()] Param([Parameter(ValueFromPipeline=$true)]$x,$Cmdlet) $x | . $cmdlet } 10 | Get-Result -Cmdlet Get-Square 10 | Get-Result -Cmdlet Get-Cube 100...
You are right. The first chart has two instructions being fetched during the second cycle. Unless specified otherwise, this cannot be done. There are circumstances in which this is allowable: The instruction fetch is divided into two stages, IF1 and IF2, each of which take 1 cycle. IF1 and IF2...
As Stefan mentioned, you made a small mistake in the second case. The proper way of using the "ARGF.gets" approach in your case would look like: while input = ARGF.gets # input here represents a line end If you rewrite the second example as above, you will not see a difference in behavior. The actual...
powershell,wix,powershell-v3.0,pipeline
A couple of issues: Note ValueFromPipeline instead of ValueFromPipelinebyPropertyName Function GetGUIDFrom-Wix { [CmdletBinding()] [OutputType([string])] Param ( # Param1 help description [Parameter(Mandatory=$true, ValueFromPipeline=$True, Position=0)] [ValidateScript({Test-Path $_ })] [string] $Path ) Also for function SetGUIDTo-Wix Function SetGUIDTo-Wix { [CmdletBinding()] [OutputType([string])] Param ( [Parameter(Mandatory=$true, ValueFromPipeline=$false, Position=0 )] [ValidateScript({Test-Path $_ })] [string] $Path,...
By calling this: dup2(pipefd[1], 1); close(pipefd[1]); in the child process you are closing the already closed pipefd[1], so close(pipefd[1]); has no effect. You should also close pipefd[0] in the child process. Same applies to the parent process. So, your code should be edited as below: int pipefd[2]; pipe(pipefd); int id...
arrays,powershell,object,syntax,pipeline
Short answer: use the unary array operator ,: ,$theArray | foreach{Write-Host $_} Long answer: there is one thing you should understand about the @() operator: it always interprets its content as a statement, even if the content is just an expression. Consider this code: $a='A','B','C' $b=@($a;) $c=@($b;) I add an explicit end-of-statement mark...
In this section: $Script += { $SplitJobQueue = $($Input) & { trap {continue} while ($SplitJobQueue.Count) {$SplitJobQueue.Dequeue()} } | }.ToString() + $Scriptblock There doesn't appear to be any reason for that pipe to be there....
Use $first and your aggregation pipeline query as below : db.collectionName.aggregate({ "$match": { "user.statuses_count": { "$gt": 99 }, "user.time_zone": "Brasilia" } }, { "$sort": { "user.followers_count": -1 // sort followers_count first } }, { "$group": { "_id": "$user.id", "followers": { "$first": "$user.followers_count" //use mongo $first method to get followers count...
Before the line that says: if test $E -ge $I temporarily place the line: echo "[$E]" and you'll find something very much non-numeric, and that's because the output of df -k looks like this: Filesystem 1K-blocks Used Available Use% Mounted on /dev/sdb1 954316620 212723892 693109608 24% / udev 10240 0...
Thank you everyone for you comments and answers. Unfortunately, there were a couple of things that I did not explicitly mention in my question, but they were quite relevant to the actual problem I was trying to solve (at the time of asking I was not aware of that). Obviously,...
Django-pipeline sends signals whenever it compiles a package; you can read more about this in the docs, and about signals in general here. You can hook this signal like this: from pipeline.signals import js_compressed def clear_files(sender, **kwargs): print kwargs if 'package' in kwargs: print kwargs['package'].sources # here remove unwanted files js_compressed.connect(clear_files) ...
camera,streaming,gstreamer,pipeline
This might depend on your OS/distribution and GStreamer version. Over here (Debian jessie, GStreamer 0.10.36) gst-inspect ffdec_h264 gives the following output: Factory Details: Long name: FFmpeg H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 decoder Class: Codec/Decoder/Video Description: FFmpeg h264 decoder Author(s): Wim Taymans <[email protected]>, Ronald Bultje <[email protected]>,...
java,apache,apache-camel,pipeline
Take a look at CamelProxy. It allows you to send to a Camel endpoint. OrderService service = new ProxyBuilder(context) .endpoint("direct:order") .build(OrderService.class); OrderService is an interface which defines the methods you want to use to send: public interface OrderService { public String send(SomeBean message); } Sample route: from("direct:order").to("bean:someProcessor"); Send a message...
Pipelines A pipeline in the template package refers to the same sort of "piping" you would do at the command line. For example, this is one way to get the default gateway assigned to your NIC on a Mac: route -n get default | grep 'gateway' | awk '{print $2}'...
linux,bash,shell,pipeline,netcat
awk is buffering when its output is not going to a terminal. If you have GNU awk, you can use its fflush() function to flush after every print gawk '{print; fflush()}' <&5 | grep foo In this particular case though, you don't need awk and grep, either will do. awk...
python,scikit-learn,pipeline,feature-selection
Unfortunately, this is currently not as nice as it could be. You need to use FeatureUnion to concatenate two kinds of features, and the transformer in each needs to select the features and transform them. One way to do that is to make a pipeline of a transformer that selects...
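A sketch of that pattern (the ColumnSelector name and the column indices are illustrative, not a scikit-learn class): each branch of the FeatureUnion first selects its columns and then transforms them.

    import numpy as np
    from sklearn.base import BaseEstimator, TransformerMixin
    from sklearn.pipeline import Pipeline, FeatureUnion
    from sklearn.preprocessing import StandardScaler

    class ColumnSelector(BaseEstimator, TransformerMixin):
        """Pick a subset of columns from a 2-D array."""
        def __init__(self, columns):
            self.columns = columns
        def fit(self, X, y=None):
            return self
        def transform(self, X):
            return np.asarray(X)[:, self.columns]

    union = FeatureUnion([
        ('scaled', Pipeline([('pick', ColumnSelector([0, 1])), ('scale', StandardScaler())])),
        ('raw',    ColumnSelector([2, 3])),
    ])
    X = np.array([[1., 2., 3., 4.],
                  [5., 6., 7., 8.]])
    print(union.fit_transform(X).shape)   # (2, 4): two scaled columns + two raw columns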
I guess what you want to achieve is done like this: List<Response<String>> responses = new ArrayList<>(); Pipeline p = jedis.pipelined(); for (int id : ids) { responses.add(p.get(String.valueOf(id))); } p.sync(); for (Response<String> response : responses) { Object o = response.get(); } ...
google-app-engine,google-bigquery,google-cloud-storage,pipeline
You can use the HTTP error code from the exception. BigQuery is a REST API, so the response codes that are returned match the description of HTTP error codes here. Here is some code that handles retryable errors (connection, rate limit, etc), but re-raises when it is an error type...
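Since the answer's own code is truncated above, here is a hedged sketch of the same idea, assuming a googleapiclient discovery client for BigQuery (function and variable names are illustrative; in practice a 403 should also be checked for a rate-limit reason before retrying):

    import time
    from googleapiclient.errors import HttpError   # packaged as "apiclient" in older App Engine bundles

    RETRYABLE_STATUSES = (403, 500, 503)

    def insert_job_with_retry(bigquery, project_id, body, attempts=5):
        # bigquery is a client built with googleapiclient.discovery.build('bigquery', 'v2', ...)
        for attempt in range(attempts):
            try:
                return bigquery.jobs().insert(projectId=project_id, body=body).execute()
            except HttpError as err:
                if err.resp.status in RETRYABLE_STATUSES and attempt < attempts - 1:
                    time.sleep(2 ** attempt)   # simple exponential backoff
                    continue
                raise                          # non-retryable: surface the error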
After talking to some people, I realized I was overthinking this. The solution is simpler than I was imagining. To pipe through commands, just do: public function whateverMethod(Dispatcher $dispatcher) { $dispatcher->pipeThrough([]); // array with commands } The Dispatcher comes in through Laravel 5's beautiful method injection!...
python,machine-learning,scipy,scikit-learn,pipeline
You can write your own transformer that'll transform input into predictions. Something like this: class PredictionTransformer(sklearn.base.BaseEstimator, sklearn.base.TransformerMixin): def __init__(self, estimator): self.estimator = estimator def fit(self, X, y): self.estimator.fit(X, y) return self def transform(self, X): return self.estimator.predict_proba(X) Then you can use FeatureUnion to glue your transformers together. That said, there's a...
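A sketch of the FeatureUnion glue, reusing the PredictionTransformer class above (the base estimators and synthetic data are illustrative; note that the base and meta models see the same training data here, so some leakage is possible):

    from sklearn.datasets import make_classification
    from sklearn.pipeline import Pipeline, FeatureUnion
    from sklearn.linear_model import LogisticRegression
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=300, n_features=20, random_state=0)

    stacked = Pipeline([
        ('base', FeatureUnion([
            ('lr', PredictionTransformer(LogisticRegression())),
            ('rf', PredictionTransformer(RandomForestClassifier(n_estimators=50, random_state=0))),
        ])),
        ('meta', LogisticRegression()),   # trained on the concatenated predicted probabilities
    ])
    stacked.fit(X, y)
    print(stacked.predict(X[:5]))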
c++,templates,c++11,pipeline,partial-specialization
You could use tag dispatching to avoid the need for partial specialization. A simplified version: //we'll use this to overload functions based on a size_t template param template <size_t N> struct Size2Type{}; class PipeOutputClass { public: template <size_t N> auto getOutput(size_t c_Idx) const { //use Size2Type to tag dispatch return...
arrays,function,powershell,hashtable,pipeline
function bcd () { Param([parameter(ValueFromPipeline=$true)][Hashtable[]]$table) Begin {$tables= @()} Process {$tables += $table} End {$tables.count} } @{ a = 10 }, @{ b = 20 }, @{ c = 30 } | bcd bcd -table @{ a = 10 }, @{ b = 20 }, @{ c = 30 } 3...
A common assumption is that you can write in the first half of a cycle and read in the second half of a cycle. Let's say that I1 is your first instruction and I2 your second instruction, and I2 uses a register that I1 modifies. Only 1 memory...
I solved that task by using 2 Jenkins extensions: https://wiki.jenkins-ci.org/display/JENKINS/EnvInject+Plugin https://wiki.jenkins-ci.org/display/JENKINS/Build+Flow+Plugin Create a properties file from the test. The file contains a property that indicates the status of the test step. With the EnvInject Plugin, add a new step to the Jenkins job (the step must come after the test run) and inject the parameter value (from the file created at the first...
You can add newlines after the pipe, and bash will continue to see it as a single pipeline: foo | bar | baz | qux can be written as foo | bar | baz | qux Or, use line continuations, if the look appeals more: foo \ | bar \...
redis,pipeline,stackexchange.redis,batching
On performance: have you timed them? Other than that: both work, and have different trade-offs; the latter is synchronous, for example, but benefits from avoiding all of the TPL overheads and complications. You might also want to consider a third option - a Lua script that accepts an array...
You can use this one. let map funcs vals = funcs |> List.map (Array.map >> ((|>) vals)) The part Array.map >> ((|>) vals) partially applies f to Array.map and then composes it with the application of vals....
pipeline,instructions,pipelining
See, in a single-cycle non-pipelined structure all instructions do not necessarily take the same amount of time; rather, the next instruction to be executed cannot start until the next clock cycle. The current instruction may complete before the end of the current cycle, because the cycle length is determined by...
python,mysql,web-scraping,scrapy,pipeline
The error is happening while you are making a SELECT query. There is a single placeholder in the query, but item['title'] is a list of strings - it has multiple values: self.cursor.execute("SELECT title, url FROM items WHERE title= %s", item['title']) The root problem is actually coming from the spider. Instead...
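A hedged sketch of both fixes (the XPath and field names are illustrative): extract a single string in the spider, and pass the query parameters as a tuple in the pipeline.

    # in the spider: take one string, not a list (extract_first() needs Scrapy >= 1.0,
    # otherwise use .extract()[0])
    item['title'] = response.xpath('//h1/text()').extract_first()

    # in the pipeline's process_item(): the single %s placeholder gets a scalar parameter
    self.cursor.execute("SELECT title, url FROM items WHERE title = %s", (item['title'],))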
scala,functional-programming,akka,pipeline,scalaz
If your pipeline is very much like a chain of method calls, use a chain of method calls! There's no point making the solution more complicated than it needs to be; if it's well-modelled by a chain of methods calls, just use that. (Or functions, which you can compose.) If...
This will do what you are looking for: tr '[:upper:][:lower:]' '[:lower:][:upper:]' ...
r,alias,plyr,pipeline,magrittr
use_series is just an alias for $. You can see that by typing the function name without the parenthesis use_series # .Primitive("$") The $ primitive function does not have formal argument names in the same way a user-defined function does. It would be easier to use extract2 in this case...
cut and paste. Easy for tab-delimited files. cut -f1 file > file1 cut -f2- file | clean.sh > file2 paste file1 file2 > clean-file For an input stream version, is this cheating? ... | (cat > foo ; paste <(cut -f1 foo) <(cut -f2- foo | ./clean.sh)) ...
You can just use ConvertFrom-Csv instead: Get-Content F:\dat| Where-Object {$_ -like "1,*"}| ConvertFrom-Csv -Header Name,Ignore1,Ignore2,Ignore3,Ignore4,Value| Select-Object Name,Value ...
pipeline,apache-kafka,amazon-redshift,data-partitioning
Assuming: you have multiple interleaved sessions; you have some kind of a sessionid to identify and correlate separate events; you're free to implement consumer logic; absolute ordering of merged events is not important - wouldn't it then be possible to use separate topics with the same number of partitions for the...
Lose the space between "image" and "freeze" and you should be good.
build,jenkins,jenkins-plugins,pipeline
a) The Build Pipeline does not have the functionality to show promotion stars in it. b) The way you have passed the parameters is correct. It should work when you use ${iso.name} in build steps. But if you use it in an 'Execute batch command' step, it will not work....
function,powershell,csv,export-to-csv,pipeline
If you want an advanced function to accept pipeline input, it has to have parameters that are declared as ValueFromPipeline or ValueFromPipelineByPropertyName and are not bound from the command line. function Out-Excel { param( [Parameter(Mandatory=$True,HelpMessage="Enter Prefix")] [string]$Prefix = $null, [Parameter(Mandatory=$True,ValueFromPipeline=$True)] [psobject]$InputObject = $null, $Path = "$env:temp\$Prefix`_$(Get-Date -Format yyMMdd_HHmmss).csv" ) begin{ $List=New-Object...
c,gstreamer,sample,pipeline,sink
I found the solution for my problem. I simply had the wrong signature for my function static GstFlowReturn new_sample (GstElement *appsink, AllElements *element) and I now use gst_base_sink_get_last_sample(GST_BASE_SINK(appsink)); to get the sample....
vb.net,ffmpeg,pipeline,video-encoding
Since you know your command string works on the command line, the easiest thing to do would be to let cmd.exe run the code for you. As Plutonix suggested in his comment, in this answer, Mark provides an example of how to do this in C# code. Process test =...
bash,shell,unix,pipeline,subshell
The problem is the use of the pipe here, which forks a sub-shell for your while loop; thus changes to DIRS are made in the child shell and are not visible in the parent shell. Besides, cat is unnecessary here. Have it this way: #!/bin/bash while read -r line do...
function,variables,powershell,pipeline
$cmd.Parameters.AddWithValue() echoes the added parameter, and PowerShell functions return the entire non-captured output on the success output stream, not just the argument of the return keyword. Quoting from about_Return (emphasis mine): SHORT DESCRIPTION Exits the current scope, which can be a function, script, or script block. LONG DESCRIPTION The Return...
Unless a variable is declared in the global scope, functions/ScriptBlocks cannot see variables declared in a module different from their own. As a workaround, you can create ScriptBlocks through [scriptblock]::Create, which creates ScriptBlocks not bound to any particular module: function FunctionWithoutModule{ [CmdletBinding()] param([parameter(ValueFromPipeline=$true)]$InputObject,$sb,$ArgumentList) process{ $SomeVariable='SomeValue' &$sb $ArgumentList } } $Module=New-Module -ScriptBlock { function...
According to the relevant part of the Scrapy's data flow: The Engine sends scraped Items (returned by the Spider) to the Item Pipeline and Requests (returned by spider) to the Scheduler In other words, return/yield your item instances from the spider and then use them in the process_item() method of...
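A minimal sketch of that flow, with hypothetical item/spider/pipeline names (the pipeline still has to be enabled via ITEM_PIPELINES in settings.py):

    import scrapy

    class MyItem(scrapy.Item):
        url = scrapy.Field()

    class MySpider(scrapy.Spider):
        name = 'myspider'
        start_urls = ['http://example.com']

        def parse(self, response):
            item = MyItem()
            item['url'] = response.url
            yield item            # the engine sends this to the item pipeline

    # pipelines.py
    class MyPipeline(object):
        def process_item(self, item, spider):
            # clean or persist the item here, then pass it along
            return item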
jenkins,jenkins-plugins,pipeline
Promoted Builds Plugin https://wiki.jenkins-ci.org/display/JENKINS/Promoted+Builds+Plugin You can use the Promoted Builds Plugin which has a manual promotion workflow. You could have: [Project] --> [Project Deploy Test] --> [Project UA Test] [Project UA Test] --(manual promotion)--> [Project Deploy Prod] Explanation: business as usual until the user acceptance tests are complete. When complete,...
command-line,gstreamer,pipeline
It is not linked because your launch line doesn't do it. Notice how the lamemp3enc element is not linked downstream. Update your launch line to: gst-launch filesrc location=surround.mp4 ! decodebin name=dmux ! queue ! audioconvert ! lamemp3enc ! mux. dmux. ! queue ! x264enc ! mpegtsmux name=mux ! queue !...