scala filter by type

Tags: scala, generics, runtime-type

I have read a TypeTag-related article, but I am unable to work out how to filter a collection by element type.

Example:

trait A
class B extends A
class C extends A

val v = Vector(new B,new C)
v filter ( _.isInstanceOf[B] )

The code above works fine. However, I want to extract the filter out of v into a standalone function. E.g.

def filter[T,T2](data:Traversable[T2]) = (data filter (  _.isInstanceOf[T])).asInstanceOf[Traversable[T]]

//Then filter v by
filter[B,A](v)

In this case I get the warning "abstract type T is unchecked since it is eliminated by erasure". I tried to use TypeTag, but it does not seem easy to get the Type at runtime.

Is there an elegant way to implement this filter function? A solution using Scala macros would also be acceptable.

Best How To:

You need to provide a ClassTag, not a TypeTag, and use pattern matching, which ClassTags are designed to work with. You can even use the collect method to perform the filter and the map in one step:

import scala.reflect.ClassTag

def filter[T, T2](data: Traversable[T2])(implicit ev: ClassTag[T]) = data collect {
    case t: T => t
}

For example:

val data = Seq(new B, new B, new C, new B)
filter[B, A](data) //Traversable[B] with length 3
filter[C, A](data) //Traversable[C] with length 1

One caveat is that this might not work as expected with nested generic types: the ClassTag can only check the runtime class, so type arguments (e.g. the T in a List[T]) are not checked.

The collect method takes a parameter of type PartialFunction, representing a function that does not need to be defined on the entire domain. When using collect, elements where the PartialFunction is undefined are filtered out, and elements that match some case statement are mapped accordingly.
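
For illustration, a small made-up example of collect with a PartialFunction:

val pf: PartialFunction[Any, Int] = { case n: Int => n * 2 }

pf.isDefinedAt("a")               // false
Seq(1, "a", 2, "b").collect(pf)   // List(2, 4): the strings are filtered out, the Ints are mapped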

For a more concise syntax, you can use an existential type and let the compiler deduce the type of the data parameter, together with a context bound for the ClassTag:

def filter[T : ClassTag](data: Traversable[_]) = data collect { case t: T => t }
filter[B](data)

One problem with these methods is that they differ from the native filter in a significant way: they always return a Traversable, while the native filter returns the most specific collection type it can. For example:

val data = Vector(new B, new B, new C, new B)
data filter { _.isInstanceOf[B] } //Vector[A]
data filter { _.isInstanceOf[B] } map { _.asInstanceOf[B] } //Vector[B]
data collect { case t: B => t } //Vector[B].  Note that if you know the type at the calling point, this is pretty concise and might not need a helper method at all

//As opposed to:
filter[B](data) //Traversable[B], not a Vector!

You can fix this by using the CanBuildFrom pattern with another implicit parameter. You can also use an implicit class to essentially add the method to the collection type (as opposed to calling the method in the static style shown above). This all adds up to a pretty complicated method, but I'll leave it here in case you're interested in these enhancements:

import scala.collection.TraversableLike
import scala.collection.generic.CanBuildFrom
import scala.reflect.ClassTag

implicit class RichTraversable[T2, Repr](val trav: TraversableLike[T2, Repr]) extends AnyVal {
    def withType[T : ClassTag, That](implicit bf: CanBuildFrom[Repr, T, That]): That = trav.collect {
        case t: T => t
    }
}

This would allow you to do:

data.withType[B] //Vector[B], as desired    

Future yielding with flatMap

scala

There's no reason to flatMap in the yield. It should be another line in the for-comprehension:

for {
  a <- fa
  b <- fb
  c <- fc
  d <- f(a, b, c)
} yield d

I don't think it can get more concise than that....
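
For illustration, a self-contained sketch (fa, fb, fc and f are made up here):

import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

def f(a: Int, b: Int, c: Int): Future[Int] = Future(a + b + c)
val fa, fb, fc = Future(1)

val result: Future[Int] = for {
  a <- fa
  b <- fb
  c <- fc
  d <- f(a, b, c)
} yield d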

Scala unapplySeq extractor syntax

scala,pattern-matching,scala-2.11

The equivalent non-infix version is:

xs match {
  case List(x, _, _) => "yes"
  case _ => "no"
}

The Scala specification says: an infix operation pattern p op q is a shorthand for the constructor or extractor pattern op(p, q). The precedence and associativity of operators in patterns is the same as in...
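
For illustration, a sketch of the same equivalence using the :: extractor on a small made-up list:

val xs = List(1, 2, 3)

xs match {
  case x :: rest => s"$x before $rest"   // infix operation pattern
  case Nil       => "empty"
}

xs match {
  case ::(x, rest) => s"$x before $rest" // the same pattern written as extractor(p, q)
  case Nil         => "empty"
}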

Generic method only compiles with one argument

java,generics

The problem is that when you call addShape(someColl, new Circle()); there are two different definitions of T:

  • ? extends Shape, from Collection<? extends Shape> someColl
  • Circle, from the second parameter

The other problem with that call is that T needs to be a concrete type for the second parameter, i.e....

Convert RDD[Map[String,Double]] to RDD[(String,Double)]

scala,apache-spark,rdd

You can call flatMap with the identity function to 'flatten' the structure of your RDD. rdd.flatMap(identity) ...
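
A minimal, self-contained sketch of that call (a local SparkContext and made-up data, just for illustration):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("flatten"))

val maps: RDD[Map[String, Double]] =
  sc.parallelize(Seq(Map("a" -> 1.0, "b" -> 2.0), Map("c" -> 3.0)))

val pairs: RDD[(String, Double)] = maps.flatMap(identity)
pairs.collect()   // Array((a,1.0), (b,2.0), (c,3.0))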

Access key from mapValues or flatMapValues?

scala,apache-spark

In this case you can use mapPartitions with the preservesPartitioning attribute:

x.mapPartitions(it => it.map { case (k, rr) => (k, someFun(rr, k)) }, preservesPartitioning = true)

You just have to make sure you are not changing the partitioning, i.e. don't change the key....

Preventing a class instantiation in Scala using Factory Pattern [duplicate]

scala,factory-pattern

The conventional way to write a factory in Scala is to define an apply method on the companion object. Here's an example using Either (because null is never/rarely used in Scala, and exceptions are ugly): class A private (n: Int) { override def toString = s"A($n)" } object A {...
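
For illustration, a self-contained sketch of that companion-object factory (the validation rule here is made up):

class A private (n: Int) {
  override def toString = s"A($n)"
}

object A {
  // hypothetical rule: only non-negative values are allowed
  def apply(n: Int): Either[String, A] =
    if (n >= 0) Right(new A(n)) else Left(s"Invalid value: $n")
}

A(3)    // Right(A(3))
A(-1)   // Left(Invalid value: -1)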

Ninject generic type xml binding

c#,xml,generics,ninject

You want to bind open generic types, so this type definition should do the trick: <bind service="Base.IJsonProvider`1, Base" to="Base.JsonProvider`1, Base" name ="Config"/> ...

Scala running issue on eclipse

eclipse,scala

To run as a Scala application, you need to create a Scala App, not a class. In Eclipse:

  1. In the Package Explorer, select project/src/package, right click, New > Scala App.
  2. Enter a name, e.g. Test, and click "Finish".
  3. Select Test.scala, right click, "Run As > Scala Application".
  4. See the results in the console window....

Register return Type

c#,generics,return-type

If you know the type at startup, you could just derive the class: public class UserLogin : GenericLogin<ABC01_REGISTERED_USER> { } Then use that class all along. Otherwise, you have to supply the type name every time, since it can't know that you want to use that type every time....

Scala first program issue

scala,recursion,case,frequency

Cons operator (::) is an infix operator so if you want to get a type of List[T] and not List[List[T]] then you should write freq(c, y.filter(_ == c),(count(c,y),c)) :: list) ...

Difficulty with SBT

scala,sbt

The %% in the dependency automatically appends a _2.XX scala version to your artifact id. It makes scala dependencies easier to manage, but you can't use it with java dependencies like apache httpcomponents. Instead just use %: "org.apache.httpcomponents" % "httpclient" % "4.5" ...
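
For example, a build.sbt sketch mixing the two styles (the versions here are just illustrative):

libraryDependencies ++= Seq(
  "org.scala-lang.modules" %% "scala-xml" % "1.0.4",   // %% appends the _2.XX suffix for Scala libraries
  "org.apache.httpcomponents" % "httpclient" % "4.5"   // plain % for Java libraries
)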

How to code a generic Swift class which stores a Generator of the same type

swift,generics

You have to use the generator type as the type placeholder G, and refer to its element type as G.Element:

class MyGenericClass<G : GeneratorType> {
    var source : G
    var itemsProcessed : [ G.Element ] = []

    init(source: G) {
        self.source = source
    }

    func getValue() -> G.Element? {
        let...

Is there any scala library that treat tuples as monads

scala,tuples,monads

Yep, Scalaz provides monad instances for tuples (up to Tuple8):

import scalaz.std.anyVal._, scalaz.std.tuple._, scalaz.syntax.monad._

scala> type IntTuple[A] = (Int, A)
defined type alias IntTuple

scala> pair >>= (a => (a+1).point[IntTuple])
res0: (Int, String) = (2,as1)

scala> for (p <- pair) yield (p + 1)
res1: (Int, String) = (2,as1)

(Note...

Solving maze with Backtracking

scala,backtracking,maze

I'm only going to comment on findStart for now. There are two things wrong with findStart: findStart is recursively called on every adjacent cell. Unfortunately, the neighbouring cell of any neighbour is the cell itself. The function never checks if you can actually walk on a given cell (I assume...

Type to impose required constrains on a double

scala,implicit-conversion

This enabled basic run time checks:

trait RangeBound
type Probability = Double with RangeBound

implicit def makeProb(p: Double): Probability = {
  assert(p >= 0.0 && p <= 1.0)
  p.asInstanceOf[Probability]
}

implicit val probabilityOrdering = Ordering.Double.asInstanceOf[Ordering[Probability]]
...

Cannot invoke method with argument list of type KeyType in Swift

swift,generics

I think the best solution for your case would be changing the class declaration to:

class EventDispatcher<U: EventDispatcherProtocol> {
    typealias KeyType = U.T

It will also simplify creation of the EventDispatcher by skipping the redundant type declarations:

var dispatcher = EventDispatcher<CustomListener<CustomEvent>>()

EDIT: Since the code was altered multiple times while...

implicit resolution for a function argument

scala,implicit,context-bound

You can overcome this by passing a function that calls mergesort to generalizedMergeSort. This call will capture the implicit Ordering:

def mergesort[A: Ordering](as: List[A]): List[A] = {
  generalizedMergeSort(as, mergesort(_: List[A]))
}

mergesort(_: List[A]) is a closure of type List[A] => List[A], which calls mergesort with its argument, and the...

Scodec: Coproducts could not find implicit value for parameter auto: scodec.codecs.CoproductBuilderAuto

scala,scodec

The code that is there now needs two minor changes:

  • The Message trait must be sealed; otherwise, Shapeless will not provide a Generic.Aux[Message, SomeCoproduct] instance.
  • The call to Codec.coproduct[Message] must come after all the subtypes are defined. Moving the companion to the end of the file is sufficient.

With...

refer to scala function by name?

scala

yyy is not a function, it's a method. You have to either convert it to a function using η-expansion:

yyy _

or use a function in the first place:

val yyy = (c: Char) => c.toUpper
// or
val yyy: Char => Char = c => c.toUpper
// or
val...

Scala - Option Type Var Manipulation

scala,scala-option

var balance = Some(0) is inferred to be of type Some[Int], when you need to tell this explicitly that it's of type Option[Int]: var balance: Option[Int] = Some(0) Then balance will be able to take in either Some(0) or None. By the way, it's sometimes a good practice to always...
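
For illustration, a minimal sketch of why the annotation matters (values are made up):

var inferred = Some(0)                 // inferred type is Some[Int]
// inferred = None                     // would not compile: None is not a Some[Int]

var balance: Option[Int] = Some(0)     // declared type is Option[Int]
balance = None                         // fine
balance = Some(100)                    // also fine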

Scala slf4j dynamic file name

scala,logging,slf4j

The slf4j library is really an interface to some underlying logging implementation. You would have log4j, logback or some other logging implementation do the heavy lifting, with an adapter jar, as explained in the slf4j documentation. You would then provide the details in the properties file for log4j for instance,...

My Scala program won't print anything

scala

So, lots of problems. All that stuff you are doing? It's getting done in the constructor of Book, and redone for every instance. Your main method? That gets compiled to an instance method of Book, not a static method, so it does not serve as an entry point for an executable...
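
For illustration, a minimal sketch of a proper entry point (independent of the Book class):

object Main extends App {
  println("running")                  // executed when the program starts
}

// or, with an explicit main method:
object Main2 {
  def main(args: Array[String]): Unit = println("running")
}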

How to instantiate lexical.Scanner in a JavaTokenParsers class?

scala,parsing,lexical-scanner

JavaTokenParsers does not implement the Scanners trait, so you would also need to extend that trait (or a trait that extends it) in order to have access to this class. Unless your expr parser accepts the Reader as a parameter (not from its apply method), you'd need to...

Spray microservice assembly deduplicate

scala,sbt,akka,spray,microservices

The issue, it seems, is that a transitive dependency of your dependency is pulling in two different versions of metrics-core. The best thing to do would be to use the right library dependencies so that you end up with a single version of this library. Please use https://github.com/jrudolph/sbt-dependency-graph , if it is...

Spray route get response from child actor

scala,akka,spray

The first problem with your code is that you need to forward from the master actor to the child so that the sender is properly propagated and available for the child to respond to. So change this (in RedisActor):

summaryActor ! msg

To:

summaryActor forward msg

That's the primary issue....

How to unmarshall akka http request entity as string?

json,scala,akka-http

Your code should be okay provided you have the right implicits in scope. If you have an implicit FlowMaterializer in scope then things should work as expected, as this code that compiles shows:

import akka.http.scaladsl.server.Route
import akka.actor.ActorSystem
import akka.stream.ActorFlowMaterializer
import akka.http.scaladsl.model.StatusCodes._
import akka.http.scaladsl.server.Directives._
import akka.stream.FlowMaterializer

implicit val system = ActorSystem("test")...

Scala (Slick) HList splitting to case classes

scala,slick

Using the tuple functionality in shapeless you could do:

import shapeless._
import syntax.std.tuple._

case class Foo(a: Int, b: String)

val hlist = 1 :: "a" :: 2 :: "b" :: HNil
Foo.tupled(hlist.take(2).tupled)
...

Scala string replacement of entire words that comply with a pattern

string,scala,scala-collections,scala-string

You can use the \bth\w* pattern to look for words that begin with th followed by other word characters, and then replace all matches with "123":

scala> "this is the example, that we think of, anne hathaway".replaceAll("\\bth\\w*", "123")
res0: String = 123 is 123 example, 123 we 123 of, anne...

How to effectively get indices of 1s for given binary string using Scala?

scala,functional-programming,higher-order-functions

You can use a filter and then map to get the index:

scala> val s = "10010010"
s: String = 10010010

scala> s.zipWithIndex.withFilter(_._1 == '1').map(_._2)
res0: scala.collection.immutable.IndexedSeq[Int] = Vector(0, 3, 6)

Note: I'm using withFilter and not filter to avoid creating a temporary collection. Or you can use collect,...
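
For comparison, a sketch of the collect variant mentioned at the end (same string as above):

s.zipWithIndex.collect { case ('1', i) => i }   // Vector(0, 3, 6)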

ZipList with Scalaz

list,scala,scalaz,applicative

pure for zip lists repeats the value forever, so it's not possible to define a zippy applicative instance for Scala's List (or for anything like lists). Scalaz does provide a Zip tag for Stream and the appropriate zippy applicative instance, but as far as I know it's still pretty broken....

Collapse similar case statements in Scala

scala,functional-programming,pattern-matching

You can use a custom extractor to abstract the matching part away from the logic part: object Leafed { def unapply(tree: Tree) = tree match { case Node(Leaf(_, _), parent, qux) => Some((parent, qux)) case Node(parent, Leaf(_, _), qux) => Some((parent, qux)) case _ => None } } And then...

How to use the Akka ask pattern without blocking

scala,asynchronous,akka,future

You don't want to block by waiting on the response of the actor, so you are using Future correctly. The code in the onComplete function is executed when your actor responds with the list. And since you don't want to block and handle it asynchronously, your last println statement is...
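
For illustration, a self-contained sketch of the non-blocking ask pattern (the actor, message, and timeout here are made up):

import akka.actor.{Actor, ActorSystem, Props}
import akka.pattern.ask
import akka.util.Timeout
import scala.concurrent.duration._
import scala.util.{Success, Failure}

case object GetList

class ListActor extends Actor {
  def receive = { case GetList => sender() ! List("a", "b", "c") }
}

object AskExample extends App {
  val system = ActorSystem("demo")
  import system.dispatcher                        // ExecutionContext for the callback
  implicit val timeout = Timeout(5.seconds)

  val listActor = system.actorOf(Props[ListActor], "listActor")

  (listActor ? GetList).mapTo[List[String]].onComplete {
    case Success(list) => println(list)           // runs later, without blocking this thread
    case Failure(err)  => err.printStackTrace()
  }
}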

Like clause not working with int column in slick

scala,slick,slick-2.0

Can you post your Status class definition? If the column is of type Column[Int], your code should give an error, as like works on Column[String]. The snippet below works for doing a like on an integer field:

class Coffees(tag: Tag) extends Table[(String, Int)](tag, "COFFEES") {
  def name = column[String]("NAME")
  def status = column[Int]("STATUS")...

Is this definition of a tail recursive fibonacci function tail-recursive?

scala,f#,functional-programming,tail-recursion,continuation-passing

The second call to go on line 4 is not in tail position, it is wrapped inside an anonymous function. (It is in tail position for that function, but not for go itself.) For continuation passing style you need Proper Tail Calls, which Scala unfortunately doesn't have. (In order to...

Implicit Generic.Aux missing on conversion from Shapeless HList to case class

scala,shapeless,type-level-computation

You're very close. The problem is that Scala isn't going to propagate implicit requirements up the call chain automatically for you. If you need a Generic.Aux[A, T] instance to call convert, then you'll have to make sure that one is in scope every time you call convert. If A and...

Implementing map on a tree using fold

scala,haskell

Try to write your last line as:

def map(tree: Tree[Int])(f: Int => Int): Tree[Int] =
  fold(tree, EmptyTree: Tree[Int])((l, x, r) => Node(f(x), l, r))

Scala's type inference is very limited compared to Haskell; in this case it tries to infer the type of fold from its arguments left to right, and incorrectly decides that the result type of fold...

How to generalize the round methods

scala

You could use the Numeric type class:

def round[T](input: T, scale: Int, f: BigDecimal => T)(implicit n: Numeric[T]): T = {
  f(BigDecimal(n.toDouble(input)).setScale(scale, RoundingMode.HALF_UP))
}

Which can be used as:

round(5.525, 2, _.doubleValue)
res0: Double = 5.53

round(123456789L, -5, _.longValue)
res1: Long = 123500000

Another way might be to create a...

How to define a Regex in StandardTokenParsers to identify path?

regex,scala,parsing,lexical-analysis

In a double quoted string backslash is an escape character. If you mean to use the literal backslash in a double quotes string you must escape it, thus "\d" should be "\\d". Furthermore you do not need to escape the regex dot within a character class, since dot has no...
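
For illustration, the two ways to write the same regex in Scala source (a hypothetical \d+ pattern):

val r1 = "\\d+".r         // backslash escaped inside a normal string literal
val r2 = """\d+""".r      // triple-quoted string, no escaping needed
r1.findFirstIn("abc123")  // Some(123)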

SCALA: change the separator in Array

arrays,string,scala,delimiter

Your question is unclear, but I'll take a shot. To go from: val x = Array("a","x,y","b") to "a:x,y:b" You can use mkString: x.mkString(":") ...

Passing a function foreach key of an Array

scala,apache-spark,scala-collections,spark-graphx

You're looking for the groupBy function followed by mapValues to process each group. pairs groupBy {_._1} mapValues { groupOfPairs => doSomething(groupOfPairs) } ...
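
For illustration, a concrete sketch where doSomething is just a sum over the values (made-up data):

val pairs = Seq(("a", 1), ("b", 2), ("a", 3))
val grouped = pairs.groupBy(_._1)                  // Map(b -> List((b,2)), a -> List((a,1), (a,3)))
val results = grouped.mapValues(_.map(_._2).sum)   // Map(b -> 2, a -> 4)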

Retrieving TriangleCount

scala,apache-spark,spark-graphx

triangleCount counts the number of triangles per vertex and returns Graph[Int,Int], so you have to extract the vertices:

scala> graph.triangleCount().vertices.collect()
res0: Array[(org.apache.spark.graphx.VertexId, Int)] = Array((1,1), (3,1), (2,1))
...

Play Framework Form Error Handling

scala,playframework,playframework-2.3,playframework-2.4

Have a look at play documentation: Writing your own field constructor. You can check on errors with @if(elements.hasErrors) within the template of your custom field constructor. <div class="input-with-label text-left @if(elements.hasErrors){field-error}"> ... Edit: You can pass the error state of your field via the args parameter to your input. From the...

Creating a generic / abstract “DBContext” Class for shared functionality among different DBs

c#,database,generics,inheritance,abstract-class

The key here is to step back and think about the problem from another angle. You are duplicating lots of code because you are creating instances of the database and command classes within the method. So inject them instead: public class SomeDBClass { static DataTable exec_DT(DBConnection conn, DBCommand cmd) {...

Scala rep separator for specific area of text

scala,parser-combinators

I guess you are using the RegexParsers (just note that it skips white spaces by default). I'm assuming that it ends with "\n\n--open--" instead (if you can change that otherwise I'll show you how to modify the repsep parser). With this change we see that the text has the following...

PlayFramework: value as is not a member of Array[Byte]

scala,playframework

You are calling the as method on the wrong object. It should look as follows: Ok(bytOfImage).as("image/jpg") ...

Generic TypeCode Type Checking?

c#,generics

You can make your code generic with something like this: var size = Marshal.SizeOf(typeof(T)); var subBuffer = new byte[size]; Array.Copy(Buff, Peek, subBuffer, 0, size); var handle = GCHandle.Alloc(subBuffer, GCHandleType.Pinned); var ptr = handle.ToIntPtr(); var val = (T)Marshal.PtrToStructure(ptr, typeof(T)); ptr.Free(); Peek += size; Peek = ( Peek + ( Align -...

Operand order in Scala List.prepend (::)

list,scala,operators

Any operator with a : on its right side has its operands flipped. There are other operators that make use of this too (can't think of any examples off the top of my head though).
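
For illustration, a few ':'-ending operators and their desugarings (using a small made-up list):

val xs = List(2, 3)
1 :: xs             // xs.::(1)  -> List(1, 2, 3)
0 +: xs             // xs.+:(0)  -> List(0, 2, 3)
(10 /: xs)(_ + _)   // xs./:(10)(_ + _), i.e. foldLeft -> 15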

How can I pass N number of generic arguments to a typedef function pointer?

c++,generics,typedef

In C++11, you can do something like this template<typename T> using fun_ptr = void (*)(T); And for the second case, template<typename... T> using fun_ptr = void (*)(T ...); ...

Zipping two arrays together with index in Scala?

arrays,scala,zip

Simply do: array1.zip(array2).zipWithIndex.map { case ((a, b), i) => (a, b, i) } ...

How to reuse MappedColumnType in Table classes?

scala,playframework,slick

You could, for example, move it to a trait like this:

trait DateColumnMapper extends HasDatabaseConfig[JdbcProfile] {
  protected val dbConfig: DatabaseConfig[JdbcProfile]
  import driver.api._

  implicit val dateColumnType = MappedColumnType.base[Date, Long](
    d => d.getTime,
    d => new Date(d)
  )
}

Then you can include this trait in whatever DAO or db component you...