Yes, any of those hash algorithms produces a uniformly distributed hash code in which no single bit carries more information than any other. You can just take any 4-5 bytes of it (as long as you take the same bytes from each code) and use them as a smaller hash code....
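For illustration, a minimal sketch in Python, assuming SHA-256 as the source hash (any of the others works the same way):

```python
import hashlib

def short_hash(data: bytes, n_bytes: int = 4) -> int:
    """Take the first n_bytes of a SHA-256 digest as a smaller hash code."""
    digest = hashlib.sha256(data).digest()
    # Any fixed slice works, as long as it is the same slice for every input.
    return int.from_bytes(digest[:n_bytes], "big")

print(hex(short_hash(b"hello world")))   # 0xb94d27b9
```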
algorithm,image-processing,compression
I don't know how any of these compression algorithms work, but I'll take a general stab at it. Compression works by removing redundancy, and uncompressed images generally have lots of it, because neighbouring pixels are correlated. That is, the colour in an area of an image tends to be similar...
c#,json,asp.net-web-api,compression,httpresponse
In case anyone doesn't want to read all the comments: the answer came from Jerry Hewett (jerhewet), namely that anti-virus software intercepts the response before it gets to the browser. The anti-virus software decompresses the data, no doubt as part of the scanning process. Huge thanks to Jerry for his...
c#,stream,compression,tpl-dataflow
As always, this is a tradeoff and the decision can only be made by you. I would go with the simplest solution, which is just to connect both blocks and let the writing block "absorb" the added compression complexity, since TDF blocks can increase their parallelism when they need...
I found a way to compress the file size of a still image without shrinking it or degrading quality: mspaint.exe. Open any file with mspaint.exe for Windows: right-click the file and click Edit. Just save the file as it is (Ctrl + S). That's compressed. Yaay! Advantages: very fast, no noticeable quality degrade...
c#,compression,hive,avro,hdinsight
Hive does support Avro out of the box: https://cwiki.apache.org/confluence/display/Hive/AvroSerDe You can define Avro tables in Hive similarly to: CREATE TABLE kst PARTITIONED BY (ds string) ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe' STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat' OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat' TBLPROPERTIES ( 'avro.schema.url'='http://schema_provider/kst.avsc'); Or as of Hive 0.14, similarly to: CREATE TABLE kst ( string1...
c#,stream,compression,gzipstream
If there's any way you can get the database result into a string and then load it into a MemoryStream you should be alright: var databaseResult = "<xml>Very Long Xml String</xml>"; using (var stream = new MemoryStream()) { using (var writer = new StreamWriter(stream)) { writer.Write(databaseResult); writer.Flush(); stream.Position = 0;...
json,compression,android-gcm,gzip,chrome-gcm
If the data is that big then you should use GCM only to inform the application that it needs to pull some updated data from the server. So use a tickle message to notify the device that there is something new at the server end. When you receive that tickle message from...
java,compression,tar,lossless-compression,xz
What you are seeing is entirely expected. Data can be compressed only if it has redundancy that can be detected and exploited. Audio and video files are already compressed. There is no redundancy in them for xz to exploit. There is plenty of redundancy in text files to exploit.
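This is easy to see for yourself; a quick sketch using Python's built-in lzma module (the same LZMA algorithm that xz uses):

```python
import lzma, os

text = b"the quick brown fox jumps over the lazy dog " * 1000   # highly redundant
noise = os.urandom(len(text))                                   # no redundancy

print(len(text), len(lzma.compress(text)))    # compresses to a tiny fraction
print(len(noise), len(lzma.compress(noise)))  # comes out slightly *larger*
```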
For cropping the image you can use this lib, and for compressing the image you can also use this code. public Bitmap decodeFile(String filePath) { BitmapFactory.Options o = new BitmapFactory.Options(); o.inJustDecodeBounds = true; BitmapFactory.decodeFile(filePath, o); // The new size we want to scale to final int REQUIRED_SIZE = 1024; //...
php,zend-framework2,compression
If I got this right, then what's happening is that you're setting the archive content to be the string /local/lnx/file.txt and not the file that resides under that path. Take a look at the method definition. It looks like you need to get the file contents (e.g. with the file_get_contents function)...
You can use the following code: foreach(var zipPath in Directory.GetFiles("C:\\Test")) { using (ZipArchive archive = ZipFile.OpenRead(zipPath)) { foreach (ZipArchiveEntry entry in archive.Entries) { var position = entry.Name.IndexOf(filetosearch , StringComparison.InvariantCultureIgnoreCase); if (position > -1) { listView1.Items.Add(entry.Name); } } } } The code gets all the files in the directory and iterates...
LZW encoding technically works; you'll just need to convert the LZW-encoded binary into URL-safe base64, so that the output doesn't contain special characters. Here's an MDN article on base64 in JavaScript; the URL-safe variant of base64 just replaces + with - and / with _. Of course, you're not likely...
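The encoding step looks like this in Python, with zlib standing in for an LZW implementation (the URL-safe base64 part is what matters here):

```python
import base64, zlib

# zlib stands in for LZW here; any compressed binary output works the same way.
compressed = zlib.compress(b"some string to shorten for a URL")

token = base64.urlsafe_b64encode(compressed).decode("ascii")
print(token)   # contains only A-Z a-z 0-9 - _ = characters, safe in a URL

original = zlib.decompress(base64.urlsafe_b64decode(token))
```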
php,html,wordpress,compression,minify
I found a simple solution using regular expression: ob_start( function ( $output ) { return preg_replace( '/(?:(?:\r\n|\r|\n)\s*){1}/s', "\n", $output ); }); Pasting this at the top of Wordpress' index.php file does the job....
javascript,node.js,compression,zip,zlib
zlib.inflate decodes the zlib format, not the zip format. There are tools out there for the zip format.
optimization,compression,png,gulp,gulp-imagemin
You can check out what the various algorithms have done by using ImageMagick's identify -verbose tool, like this identify -verbose com.png > com.txt identify -verbose tin.png > tin.txt identify -verbose gulp.png > gulp.txt and then compare the outputs - I use opendiff on the Mac. You will see this if...
There is no overhead (or maybe you meant metadata) added by the function fwrite in MATLAB. The function is as "low level" as it comes, and on a given machine it will give similar results to the equivalent low-level functions in C, C++ and many more languages. To...
multithreading,compression,zlib
It is possible to have multiple threads compressing data simultaneously, as long as each thread has its own separate z_stream object. Each z_stream object should have deflateInit() called on it, then as many calls to deflate() as necessary, and then deflateEnd() called after all of the uncompressed data has been...
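The answer above is about zlib's C API, but the rule carries over to any binding; here is a sketch in Python where each thread owns its own compression object (the z_stream equivalent):

```python
import threading, zlib

def compress_chunk(data: bytes, results: list, index: int) -> None:
    # Each thread gets its own compression object -- never shared.
    c = zlib.compressobj()
    results[index] = c.compress(data) + c.flush()

chunks = [b"first block " * 100, b"second block " * 100]
results = [None] * len(chunks)
threads = [threading.Thread(target=compress_chunk, args=(d, results, i))
           for i, d in enumerate(chunks)]
for t in threads: t.start()
for t in threads: t.join()
print([len(r) for r in results])   # two independently compressed streams
```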
javascript,compression,minify,google-closure-compiler,yui-compressor
This is covered in the Closure Compiler FAQ. https://github.com/google/closure-compiler/wiki/FAQ#closure-compiler-inlined-all-my-strings-which-made-my-code-size-bigger-why-did-it-do-that Closure Compiler assumes that you are using gzip compression. If you do not, you should. Configuring your server to gzip your code is one of the most effective and easiest optimizations that you can possibly do. The gzip algorithm works by trying...
I think you are getting correct results. CouchDB stores documents in chunks of 4 KB each (can't find a reference at the moment, but you can test it out by storing an empty document). That is, the minimum size of a document is 4 KB. Which means that even if you store a...
People who recommend this procedure rarely do so from a background of expertise or knowledge -- it's rather based on gut feelings. The detour of generating a new PDF via PostScript and back (also called "refrying a PDF") is never going to give you the optimal results. Sometimes it is...
string,random,compression,pi,xz
It's a matter of information density. Compression is about removing redundant information. In the string "314159", each character occupies 8 bits, and can therefore have any of 2^8 = 256 distinct values, but only 10 of those values are actually used. Even a painfully naive compression scheme could represent the...
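For example, a painfully naive scheme along those lines, sketched in Python: only 10 of the 256 possible byte values occur, so each digit fits in 4 bits instead of 8, halving the size before any real compression even starts.

```python
def pack_digits(s: str) -> bytes:
    """Pack decimal digits two per byte (4 bits each) -- a naive 2:1 scheme."""
    nibbles = [int(ch) for ch in s]
    if len(nibbles) % 2:
        nibbles.append(0xF)          # padding marker for odd lengths
    return bytes((a << 4) | b for a, b in zip(nibbles[::2], nibbles[1::2]))

print(len("314159"), len(pack_digits("314159")))   # 6 -> 3 bytes
```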
asp.net,asp.net-mvc,caching,compression,asp.net-5
As "agua from mars" states in the comments, if you're using IIS you can use IIS's static file handling, in which case you can use the <system.webServer> section in a web.config file and that will work as it always did. If you're using ASP.NET 5's StaticFileMiddleware then it has its...
ios,image-processing,compression
You can use ALAssetsLibrary's -writeImageDataToSavedPhotosAlbum:metadata:completionBlock: method to save image data to library. ALAssetsLibrary *al = [[ALAssetsLibrary alloc] init]; [al writeImageDataToSavedPhotosAlbum:data metadata:nil completionBlock:^(NSURL *assetURL, NSError *error) { NSLog(@"Completion block/Do check the error if any"); }]; Try it and see ...
apache,.htaccess,compression,server
There is no default empty .htaccess. You can add a .htaccess file in any site directory, depending on your needs. It's just a simple text file that must be saved using the name .htaccess. As the name starts with a dot, the file is not visible from outside (online). If it...
linux,git,ubuntu,compression,tar
Option '--exclude' must be at the beginning of the tar command. And do not forget quotation marks. Try this: tar --exclude='slice/media*' --exclude='slice/images/doc/*' -zcvf slice.tar.gz slice...
php,mysql,string,algorithm,compression
gzcompress compresses the given string using the ZLIB data format. For details on the ZLIB compression algorithm see the document ZLIB Compressed Data Format Specification version 3.3 (RFC 1950) You can read one explanation of how ZLIB works in: http://www.zlib.net/feldspar.html <?php $compressed = gzcompress(' This function compresses the given string...
I completed this task using the following code: $quality = 90; while(filesize($full_path) > 1048576 && $quality > 20) { $img = imagecreatefromjpeg($full_path); imagejpeg($img,$full_path,$quality); $quality = $quality - 10; clearstatcache(); } if(filesize($full_path) > 1048576) {echo "<p>File too large</p>"; unlink($full_path); exit;} The $quality > 20 part of the statement is to keep...
Even .NET's GZipStream, which does not support multiple members (contrary to the spec, btw), nevertheless supports gzip files of more than 4 GB, now that (since .NET 4.0) the underlying DeflateStream supports it. So that would seal it: multiple gzip members are NOT necessary for input greater than 4 GB. The gzip specs...
ios,xcode,sprite-kit,compression,png
Does iOS "likes" more png-24? iOS certainly likes its images to be close to its own hardware format (see below). However, it may not presume a certain format, or convert images at will. This would mean that the default postprocessing could convert images from palettized (8-bit) to true-color images,...
Finally fixed the issue. The problem was that I had to "unblock" the jpegtran.exe file. That's it. ...
Yes, Apple will allow apps of any size on the App Store. As a point of reference, the game Infinity Blade III is 1.92GB. But keep in mind that apps (or their updates) over 100MB cannot be downloaded over cellular, so if it's possible to structure the app in such...
java,string,file,stream,compression
I solved this by encoding and decoding the bytes with Base64.
arrays,performance,compression
Look up "sparse array". If access speed is important, a good solution is a hash table of indices. You should allocate about 2x the space, requiring a 180 GB table. The access time would be O(1). You could have just a 90 GB table and do a binary search for...
php,html,compression,gzip,zlib
This function reads, compresses and writes small chunks, so you can use it to compress big files. /** * @return bool * @param string $source * @param string $dest * @desc compressing the file with the zlib-extension */ function gzipCompress($source, $dest, $level = 5){ if($dest == false){ $dest = $source.".gz"; }...
So I went back a bit and guessed that the © is taking 2 chars. Then I ran var_dump(substr('0©',1,2)); and confirmed my suspicion. The problem was that I was cropping the string with substr('0©',$i,2); Problem solved. $letter = mb_substr($text,$i,2,'UTF-8'); solved my problem...
Did you configure Snappy? First verify that Snappy is loaded on all the nodes. To verify, please use this command: hbase org.apache.hadoop.hbase.util.CompressionTest hdfs://host/path/to/hbase snappy Once the Snappy test is successful, the compression mentioned above should work. For more detail about configuration and installation of Snappy: http://hbase.apache.org/0.94/book/snappy.compression.html...
java,compression,nodes,huffman-coding
You can use indices instead of nodes, but somewhere you still need nodes and/or vertices.
database,compression,internals,voltdb
VoltDB is designed for maximum throughput & transaction processing performance, so it does not compress data for tables, views, or indexes stored in memory. We have a blog post on database sizing, or how you can estimate the size of RAM needed for a use case and schema....
c#,.net-3.5,zip,compression,zipstorer
You can use (back)slashes for the _filenameInZip (sic) parameter to add files in a directory in the zip: zip.Addfile(,,"directory/filename.txt",); Or zip.Addfile(,,"directory\\filename.txt",); ...
image,compression,cloud,storage
Look into Haystack for large applications. Uploading into a file system and then storing into a DB would be sufficient for smaller applications, even up to millions. However, larger companies such as Flickr use Haystack objects.
c,compression,zlib,decompression,deflate
When deflating, you are setting the same dictionary every CHUNK input bytes. Why? You should use deflateSetDictionary() once, right after deflateInit2(). From there on, the input data itself should serve as a better source of matching strings than a dictionary you might provide. On the inflating side, you would have...
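For illustration, here is the set-the-dictionary-once pattern through Python's zlib binding (a sketch, not the asker's C code; the zdict argument corresponds to deflateSetDictionary/inflateSetDictionary):

```python
import zlib

dictionary = b"common strings that often appear in the data"

# Set the dictionary once, right after creating the stream -- not per chunk.
comp = zlib.compressobj(zdict=dictionary)
data = comp.compress(b"common strings appear here too") + comp.flush()

# The inflate side must be given the same dictionary.
decomp = zlib.decompressobj(zdict=dictionary)
print(decomp.decompress(data))
```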
compression,glsl,bit-shift,glsles,bit-packing
Well, bit shifting can be represented by multiplication (for left shifts) or division (for right shifts) with powers of two. You just have to take into account that the floats will store the fractional parts which would normally be shifted "out" in normal integer bitshifts. So to pack 4 normalized...
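The arithmetic is easy to verify outside a shader; a Python sketch of packing four 8-bit values into one number using only multiplication, floor and division (the operations available in GLSL):

```python
from math import floor

def pack4(a, b, c, d):
    """Pack four values in [0, 255]: each *256 is a left shift by 8 bits."""
    return ((a * 256.0 + b) * 256.0 + c) * 256.0 + d

def unpack4(p):
    out = []
    for _ in range(4):
        out.append(int(p % 256.0))   # modulo plays the role of masking
        p = floor(p / 256.0)         # division by 2^8 is a right shift by 8
    return out[::-1]

# Note: a 32-bit GLSL float has only a 24-bit mantissa, so in a real shader
# fewer bits per channel (or fewer channels per float) must be used.
print(unpack4(pack4(12, 34, 56, 78)))   # [12, 34, 56, 78]
```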
c++,visual-studio-2013,compression,dicom,dcmtk
The problem might well be in how you are populating the Pixel Data (in native format) when creating the uncompressed dataset. An uncompressed image should be encoded into the Pixel Data element itself and not encapsulated into a 2nd Item element and so on. If you are missing the Sequence Delimitation Item, it is...
As is pointed out in comments, the optimal encoding -- if all permutations are equally probable -- is to replace the entire permutation with its index in the enumeration of permutations. Since there are n! possible permutations, the index requires log₂(n!) bits, and therefore the compression ratio from the naive...
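For reference, a small Python sketch of that optimal encoding (the Lehmer code, which maps a permutation to its index in the lexicographic enumeration):

```python
from math import factorial

def permutation_rank(perm):
    """Return the index of perm among all permutations of its elements."""
    rank, remaining = 0, sorted(perm)
    for x in perm:
        i = remaining.index(x)
        rank += i * factorial(len(remaining) - 1)  # skip all smaller branches
        remaining.pop(i)
    return rank

print(permutation_rank([2, 0, 1]))   # 4: the 5th of the 3! = 6 permutations
```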
To ensure that your histc call does count the number of x values per unique x value, call it as h = histc(x, linspace(xmin, xmax, numel(unique(x(:))))); Else, if your image is binary and your only values are 0 and 255, histc will return an array of size(h)=256 with lots of zeroes, because xmin:xmax is 0:255=[0...
image,matlab,compression,communication
If there are no constraints on the 256 bits, then it will take 256 bits to transmit. It cannot be compressed in all cases. End of story.
dictionary,compression,lzw,text-compression
LZW is one quite specific compression algorithm, which was a significant milestone in the history of compression algorithms, but more due to its relative simplicity and speed than due to its compression ratio. LZW also has the advantage that it is a single-pass algorithm, making it a good choice for...
My favorite online compressor is http://refresh-sf.com/ - it will do the merge for you.
Since all you can do is edit your source code, the only way to reduce the size of your executable is to find ways to consolidate stuff. Things to look out for: Find dead code and resources. Delete all functions/methods/variables that are not used. Find duplicate code and data. For...
If you just want to add a header you can always add a GZip ContentEncoding Response Header with a Response Filter, e.g: GlobalRequestFilters.Add((req, res, dto) => res.AddHeader(HttpHeaders.ContentEncoding, CompressionTypes.GZip)); But ServiceStack only compresses cached responses itself, i.e. when using the ToOptimizedResult() APIs or returning a response in a CompressedResult, e.g: public object...
algorithm,math,integer,compression
First, you could just use any existing compression algorithm, via some library. However, knowing that your input is very specialized, you can also write a special algorithm adapted to your case. But let's first analyze how much you can compress this input. To simplify, I'll first consider compressing exactly 12...
linux,command-line,terminal,compression,tar
You should use --exclude (not -x) and repeat this long option before every pattern, like: tar -zcvf master16march.tar.gz master --exclude "master/media/com_easysocial/photos/*" --exclude "master/media/com_easysocial/avatars/*" --exclude "master/media/com_easysocial/tmp/*" --exclude "master/media/com_easydiscuss/attachments/*" --exclude "master/images/joomcareer/*" Alternatively you can use -X to point to a file containing exclude patterns. If you want to exclude folders entirely (not...
string,scala,compression,byte,zlib
When you've got arbitrary binary data, never ever try to convert it to a string as if it's actually text data which has been encoded into binary data using a normal encoding such as UTF-8. (Even when you do have text data, always specify the encoding when calling the String...
c#,sql-server-2005,compression,gzip,binary-data
Raw deflate-compressed data begins 32 bytes in (starting with the ec 7c). You can use zlib to decompress it.
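In Python terms, the suggested decompression would look roughly like this (a sketch; blob.bin is a hypothetical file holding the column value, and wbits=-15 tells zlib to expect raw deflate data with no header):

```python
import zlib

# Hypothetical dump of the binary column value.
with open("blob.bin", "rb") as f:
    blob = f.read()

raw = blob[32:]                     # skip the 32-byte prefix described above
data = zlib.decompress(raw, -15)    # negative wbits = raw deflate, no header
print(data[:80])
```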
compression,tar,rar,winrar,winzip
WinRAR compresses by default each file separately. So by default there is no real gain when compressing a folder structure with many similar or even identical files. But there is also the option to create a solid archive. Open the WinRAR help and on the Contents tab open the item Archive...
You can use this function to create the reduced Images: public static Image ShrinkImage(Image original, int scale) { Bitmap bmp = new Bitmap(original.Width / scale, original.Height / scale, original.PixelFormat); using (Graphics G = Graphics.FromImage(bmp)) { G.InterpolationMode = InterpolationMode.HighQualityBicubic; G.SmoothingMode = SmoothingMode.HighQuality; Rectangle srcRect = new Rectangle(0,0,original.Width, original.Height); Rectangle destRect =...
c++,algorithm,vector,compression
Use a hash function instead of compression. Generate a small hash, e.g. 32 bits, from the 100 elements of each vector. Then when you get a new vector, compare the hash of the new vector with the hashes of all of your old vectors. If the hashes are not equal,...
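A sketch of that scheme in Python, with CRC-32 standing in for the 32-bit hash (any reasonably uniform hash would do):

```python
import zlib

def vector_hash(vec) -> int:
    """32-bit hash of a vector of ints."""
    return zlib.crc32(b"".join(x.to_bytes(4, "little", signed=True) for x in vec))

seen = {}                      # hash -> vectors with that hash (collision check)

def is_duplicate(vec) -> bool:
    h = vector_hash(vec)
    if any(old == vec for old in seen.get(h, [])):
        return True            # equal hash *and* equal elements
    seen.setdefault(h, []).append(vec)
    return False

print(is_duplicate([1] * 100), is_duplicate([1] * 100))   # False True
```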
compression,huffman-coding,run-length-encoding
If RLE works, Huffman will work all the better. There's a proof that if your file is large enough, Huffman will converge to the maximum possible entropy, thus maximizing compression.
java,web-services,rest,compression,gzip
If you are using Spring Boot and Tomcat, you should be able to accomplish this via Tomcat configuration: http://docs.spring.io/spring-boot/docs/current-SNAPSHOT/reference/htmlsingle/#how-to-enable-http-response-compression Here is a similar post: Using GZIP compression with Spring Boot/MVC/JavaConfig with RESTful It's as simple as follows: server.tomcat.compression=on server.tomcat.compressableMimeTypes=application/xml ...
javascript,automation,compression,partials
Javascript is directly interpreted by your browser so there is no "partials precompiling" stuff. What you are looking for is a task runner like Gulp or Grunt that can launch a task (amongst others) that will concatenate your files. Here is a concat task for Gulp...
mysql,nosql,compression,couchdb
CouchDB trades disk-space for read/write speed. It's very likely a comparable CouchDB database will take up more disk-space than MySQL. That being said, there are a number of things you can do to conserve disk-space: Database file compression is available, (as you pointed out) but you should probably experiment with...
java,hadoop,compression,apache-spark,snappy
I am the developer of snappy-java. Hadoop's SnappyCodec is not exactly the same as Snappy's format specification: https://code.google.com/p/snappy/source/browse/trunk/format_description.txt SnappyCodec in Hadoop extends this format to compress large data streams. Data is split into blocks (via BlockCompressionStream), and each block has some header and compressed data. To read the compressed data...
php,validation,upload,compression
Just like georg said, I can simply use the zip or rar functions to list what's in the archive and extract the files in it.
Compression and file formats are completely different things. A file format describes the structure of data stored in a file. Avro will contain Avro serialized objects, SequenceFile will contain a key (usually a number) and a value (the original data). Parquet is a special file format which allows columnized storage...
c++,visual-studio-2013,compression,dicom,dcmtk
Well, maybe I have solved the problem. I have added the instruction dataset->removeAllButCurrentRepresentations(); before accessing the compressed pixel data. I can also do PixelData->removeAllButCurrentRepresentations(); instead of the instruction before, and it works in the same way.... But I really don't understand why it works.... can you please try to explain? Thanks...
python,pdf,python-3.x,compression,ghostscript
I fixed the issue following @KenS's suggestion of giving the output file a different name, and this worked. from __future__ import print_function import os import subprocess for root, dirs, files in os.walk("C:\comp"): for file in files: if file.endswith(".pdf"): filename = os.path.join(root, file) print (filename) arg1= '-sOutputFile=' + "c" + file...
AAC is one of many audio codecs; a codec is the flavor of compression ... once encoded, it becomes binary data which needs to get wrapped inside a container format for transport over the wire or into a file ... one such container is m4a. Doing the math on 64 kbps...
"Random" access is not good on a .tar.gz, since that is a .tar file that has been wrapped in a .gz compression, so to get to things in the .tar file, you'd first have to decompress the .tar file. It would be possible to use a .tar file that contains...
amazon-web-services,compression,hive,snappy
ORC files are binary files that are in a specialized format. When you specify orc.compress = SNAPPY, the contents of the file are compressed using Snappy. ORC is a semi-columnar file format. Take a look at this documentation for more information about how data is laid out. Streams are compressed...
compression,gzip,huffman-coding
infgen will show you the dynamic block headers in some detail. infgen -d will show you them in all their detail. I don't know that that will help with what you are trying to do. It sounds like what you're looking for are preset dictionaries. In zlib you can use...
javascript,performance,compression,minify,uglifyjs2
It does indeed cause JIT issues on some engines. A much simpler benchmark that doesn't conflate a dozen things demonstrates this (below). So you can do it, but you have to accept the performance cost, which is significant on some engines (reduces property access speed by ~64% on Chrome, for...
I don't see the problem. You should be able to use e.g. the gzip module just fine, something like this: inf = open("mydata.txt", "rb") outf = gzip.open("file.txt.gz", "wb") outf.write(inf.read()) outf.close() inf.close() There's no problem with the file being overwritten, the name given to gzip.open() is completely independent of the name...
c#,compression,out-of-memory,sharpziplib
You're trying to create a buffer as big as the file. Instead, make the buffer a fixed size, read some bytes into it, and write the number of read bytes into the zip file. Here's your code with a buffer of 4096 bytes (and some cleanup): public static void CompressFiles(List<string>...
It depends entirely upon how large the arrays are and how much they would benefit from compression - neither of which you have disclosed. For example, if they were 50k and could be compressed to 40k, that difference would be unlikely to be perceived. If they were 1MB and could...
javascript,node.js,compression,gulp,webpack
I found a solution. We can use normal version of webpack (not just gulp-webpack) to provide plugin include capability: var gulpWebpack = require('gulp-webpack'), webpack = require('webpack'); gulp.task('webpack', function() { gulp.src('webpack-init.js') .pipe(gulpWebpack({ output: { filename: 'bundle.js', }, plugins: [new webpack.optimize.UglifyJsPlugin()], }, webpack)) .pipe(gulp.dest('./client/js')); }); ...
javascript,c#,compression,decompression
I don't know how well Javascript itself (through libraries) can use compressed data, but it doesn't need to. When you GZip the data server side and set the "Content-Encoding" header value to "gzip", then the browser will decompress it before handing it over to your js code. So you have...
This is my code (not an optimal solution) to solve the problem, with comments.... Hope it helps with your own code and program!! #include <stdio.h> #include <string.h> int main(int argc, char *argv[]) { /* file is a pointer to the file input where to read the line */ /* fileto...
Check this out, it may be helpful (not the complete answer): ZipParameters zp = new ZipParameters(); zp.setFileNameInZip("sample.zip"); System.out.println(zp.getCompressionMethod()); System.out.println(Zip4jConstants.COMP_DEFLATE); Output: 8 8 ...
algorithm,language-agnostic,binary,compression,storage
The information-theoretic minimum for storing 100 different values is log₂ 100, which is about 6.644. In other words, the possible compression from 7 bits is a hair more than 5%. (log₂ 100 / 7 is 94.91%.) If these pairs are simply for temporary storage during the algorithm, then it's almost certainly not...
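To make the arithmetic concrete, a trivial check in Python:

```python
from math import log2

bits_needed = log2(100)                 # 6.643856...
print(bits_needed, bits_needed / 7)     # ~6.644 and ~0.9491: a hair over 5% saved
```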
ios,objective-c,video,compression,avassetwriter
You need a PhD to work with AVAssetWriter - it's non-trivial: https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/05_Export.html#//apple_ref/doc/uid/TP40010188-CH9-SW1 There's an amazing library for doing exactly what you want which is just an AVAssetExportSession drop-in replacement with more crucial features like changing the bit rate: https://github.com/rs/SDAVAssetExportSession Here's how to use it: -(void)imagePickerController:(UIImagePickerController *)picker...
Actually, in real-world measurements a higher compression level yields lower decompression times (which might be primarily caused by the fact that you need to handle less permanent storage and less RAM access). Since, actually, most things that happen at a client with the data are rather expensive compared to...
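This is easy to measure rather than guess; a rough benchmark sketch in Python (the dictionary file path is just an assumption, use any redundant test data; results vary by machine and data, which is exactly why you should measure):

```python
import time, zlib

with open("/usr/share/dict/words", "rb") as f:   # any redundant test file
    data = f.read()

for level in (1, 6, 9):
    blob = zlib.compress(data, level)
    start = time.perf_counter()
    for _ in range(50):
        zlib.decompress(blob)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(blob)} bytes, decompress x50 in {elapsed:.3f}s")
```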
asp.net,vbscript,compression,hex
The function h2s needs to break the input string into reasonable parts. As done here and in your code, Split() - on space - is used. In that case you can go without the zero padding Right("0000" & Hex(AscW(a(i))+AscW(Mid(k,y,1))), 4) in s2h. Alternatively, you can keep the padding and use...
java,arrays,for-loop,compression
You are close to the right answer. I suggest you step through your code in your debugger to see what you are doing. Since it is hard to give you a hint without giving you the answer... public static int[] compression(int[] array){ int j = 0; // iterate over all...
format,compression,textures,opengl-3,opengl-4
Seems you already did deep research about the compression formats available. Here goes my 2 cents contribution: Most compressed formats are lossy. I guess only GL_OES_compressed_paletted_texture is not lossy, but it only supports 16 or 256 colors and there are some concerns about performance. It's supported by Adreno GPUs, as well...
With Reader you provide byte by byte access to the data you want to compress. So with std::vector<short> it would look like this. struct reader : public libzpaq::Reader { reader(const std::vector<short>& v) : m_v(v), m_offset(0) { } int get() { if (m_offset < m_v.size() * sizeof (short)) { return *((char*)...
Reversibly encoding the hash doesn't impact collision rate... Unless your encoding causes some loss of data (then it isn't reversible any more). Base64 and other binary-to-text encoding schemes are all reversible. Your first output is the hexadecimal (or base16) representation, which is 50% efficient. Base64 achieves 75% efficiency, meaning it...
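The efficiency difference is easy to demonstrate; a quick Python check on a 16-byte (128-bit) hash-sized value:

```python
import base64, os

digest = os.urandom(16)                        # stand-in for a 128-bit hash
hex_form = digest.hex()                        # 2 chars per byte -> 50% efficient
b64_form = base64.b64encode(digest).decode()   # ~1.33 chars per byte -> 75%

print(len(hex_form), len(b64_form))            # 32 vs 24 characters
```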
c++,visual-studio-2013,compression,dicom,dcmtk
I solved the problem by adding the tags DCM_BitsAllocated and DCM_PlanarConfiguration. These are the tags that were missing. I hope this is useful for someone.
You are passing in several separate arguments to the open() function: with open(first_answer, ".", second_answer, 'rb') as in_file: The 3rd positional argument to the open() function is the buffer argument which always must be an integer if specified. You need to concatenate those strings with + or use string formatting...
The code you've written is causing the ZipArchive class to write a whole new archive at the end of your previous one, which of course corrupts the file. The way to do what you want is to copy the original archive to a new file as you create it, and...
c,binary,compression,decompression
Nice try; just a few corrections. You have to store a marker that will give you a clue how long the original value was, so you can decompress it back. Sure thing, there is nothing new under the sun, so you can reuse such a thing as VarInt: https://developers.google.com/protocol-buffers/docs/encoding#varints It's a wide-spread practice...
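For reference, a Python sketch of the VarInt idea from that link: 7 data bits per byte, with the high bit marking "more bytes follow".

```python
def varint_encode(n: int) -> bytes:
    out = bytearray()
    while True:
        byte = n & 0x7F                          # low 7 bits of the value
        n >>= 7
        out.append(byte | (0x80 if n else 0))    # high bit = continuation flag
        if not n:
            return bytes(out)

def varint_decode(data: bytes) -> int:
    n = 0
    for shift, byte in enumerate(data):
        n |= (byte & 0x7F) << (7 * shift)        # little-endian 7-bit groups
        if not byte & 0x80:
            break
    return n

encoded = varint_encode(300)
print(encoded.hex(), varint_decode(encoded))     # ac02 300
```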