Is there a way to tell that the persistence backend is unreachable when working with a persistent actor?

Tags: scala, akka, akka-persistence

I am working with a persistent actor in Scala, using EventStore as a backend; testing is specs2-based. During initialization of the spec class, inside the constructor of another class that is being instantiated there, I ask my actor for something and, if EventStore is not running, I get

Could not create an instance of 
com.optrak.opkakka.authentication.AuthenticationManagementSpec

caused by

akka.pattern.AskTimeoutException: Ask timed out on 
[Actor[akka://com-optrak-opkakka-authentication-AuthenticationManagementSpec/user/$b/AuthenticationModel#1565142060]] after [2000 ms]

Where AuthenticationModel is my actor's name.

The questions are:

First, why doesn't my actor respond to the ask? The asked command is not persisted, and the actor hasn't received any persisted commands to change its state at this point, because it has just been created.

Second, how can I detect beforehand that the backend is not running, so I can issue a warning to the user?

Best How To:

Using pointers from ktoso (thanks!) and a small test project, I found my own way. I handle RecoveryFailure in my persistent actor, as suggested by some of the error messages, by rethrowing it as a new IllegalStateException with a suggestion to check whether EventStore is running (a sketch of that handler follows the strategy below). The supervisor then gets a shot at handling this using its custom strategy:

import akka.actor.OneForOneStrategy
import akka.actor.SupervisorStrategy.{Escalate, Restart}

// Restart the child on IllegalStateException; defer everything else to the default decider.
override def supervisorStrategy = OneForOneStrategy(maxNrOfRetries = 3) {
  case _: IllegalStateException => Restart
  case t =>
    super.supervisorStrategy.decider.applyOrElse(t, (_: Any) => Escalate)
}
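
For reference, here is a minimal sketch of the RecoveryFailure handling described above. It assumes an Akka 2.3-era akka-persistence API, where a RecoveryFailure message is delivered to receiveRecover when replay from the journal fails; the actor body and persistenceId are illustrative, not taken from the original code.

import akka.persistence.{PersistentActor, RecoveryFailure}

class AuthenticationModel extends PersistentActor {
  override def persistenceId = "authentication-model"

  override def receiveRecover: Receive = {
    // If EventStore is unreachable, replay fails and this message arrives;
    // rethrow so the supervisor strategy above gets to decide what to do.
    case RecoveryFailure(cause) =>
      throw new IllegalStateException("Recovery failed - is EventStore running?", cause)
    case _ => // apply replayed events to internal state here
  }

  override def receiveCommand: Receive = {
    case _ => // handle commands here
  }
}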

After my persistent actor is restarted three times, everything terminates (by the way, what REALLY happens under the hood here?) and I have all the stack traces and error messages in the log file.

Future yielding with flatMap

scala

There's no reason to flatMap in the yield. It should be another line in the for-comprehension. for { a <- fa b <- fb c <- fc d <- f(a, b, c) } yield d I don't think it can get more concise than that....
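
A self-contained sketch of that shape, with hypothetical Futures standing in for fa, fb, fc and f:

import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Placeholder values; in the question these come from elsewhere.
val fa = Future(1)
val fb = Future(2)
val fc = Future(3)
def f(a: Int, b: Int, c: Int): Future[Int] = Future(a + b + c)

val result: Future[Int] = for {
  a <- fa
  b <- fb
  c <- fc
  d <- f(a, b, c) // just another generator line; no flatMap needed in the yield
} yield d

println(Await.result(result, 1.second)) // 6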

Retrieving TriangleCount

scala,apache-spark,spark-graphx

triangleCount counts the number of triangles per vertex and returns a Graph[Int,Int], so you have to extract the vertices: scala> graph.triangleCount().vertices.collect() res0: Array[(org.apache.spark.graphx.VertexId, Int)] = Array((1,1), (3,1), (2,1)) ...

Scala first program issue

scala,recursion,case,frequency

The cons operator (::) is an infix operator, so if you want to get a List[T] and not a List[List[T]], then you should write freq(c, y.filter(_ == c),(count(c,y),c)) :: list) ...

How to define a Regex in StandardTokenParsers to identify path?

regex,scala,parsing,lexical-analysis

In a double-quoted string, backslash is an escape character. If you mean to use a literal backslash in a double-quoted string, you must escape it; thus "\d" should be "\\d". Furthermore, you do not need to escape the regex dot within a character class, since dot has no...

Scala - Option Type Var Manipulation

scala,scala-option

var balance = Some(0) is inferred to be of type Some[Int], so you need to say explicitly that it's of type Option[Int]: var balance: Option[Int] = Some(0) Then balance will be able to take either Some(0) or None. By the way, it's sometimes a good practice to always...

Operand order in Scala List.prepend (::)

list,scala,operators

Any operator with a : on its right side has its operands flipped. There are other operators that make use of this too (can't think of any examples off the top of my head though).
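
A small example of that flipping, using the list operators :: and ::: whose names end in a colon:

val xs = List(2, 3)

// Operators ending in ':' are right-associative: the method is invoked on
// the right-hand operand, so these two lines are equivalent.
val a = 1 :: xs
val b = xs.::(1)
println(a == b) // true

// ::: (list concatenation) also binds to its right-hand operand.
println(List(0, 1) ::: xs) // List(0, 1, 2, 3)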

How to generalize the round methods

scala

You could use the Numeric type class def round[T](input: T, scale: Int, f: BigDecimal => T)(implicit n: Numeric[T]): T = { f(BigDecimal(n.toDouble(input)).setScale(scale, RoundingMode.HALF_UP)) } Which can be used as: round(5.525, 2, _.doubleValue) res0: Double = 5.53 round(123456789L, -5, _.longValue) res1: Long = 123500000 Another way might be to create a...

Implementing map on a tree using fold

scala,haskell

Try to write your last line as def map(tree:Tree[Int])(f:Int=>Int) : Tree[Int] = fold(tree , EmptyTree:Tree[Int])((l,x,r) => Node(f(x),l,r)) Scala's type inference is very limited compared to Haskell's; in this case it tries to infer the type of fold from its arguments left to right, and incorrectly decides that the result type of fold...

IntelliJ - use imported modules as dependencies like maven projects in Eclipse

eclipse,scala,maven,intellij-idea,sbt

It should work out of the box for dependencies that are imported into the project as modules; no additional settings are needed, at least for Java. Just do not run a Maven goal that would use dependencies from the repository. ...

How to reuse MappedColumnType in Table classes?

scala,playframework,slick

You could, for example, move it to a trait like this: trait DateColumnMapper extends HasDatabaseConfig[JdbcProfile] { protected val dbConfig: DatabaseConfig[JdbcProfile] import driver.api._ implicit val dateColumnType = MappedColumnType.base[Date, Long]( d => d.getTime, d => new Date(d) ) } Then you can include this trait in whatever DAO or db component you...

Like clause not working with int column in slick

scala,slick,slick-2.0

Can you post your Status class definition? If the column is of type column[Int], your code should be giving an error, as like works on column[String]. The snippet below works for doing a like on an integer field. class Coffees(tag: Tag) extends Table[(String, Int)](tag, "COFFEES") { def name = column[String]("NAME") def status = column[Int]("STATUS")...

Preventing a class instantiation in Scala using Factory Pattern [duplicate]

scala,factory-pattern

The conventional way to write a factory in Scala is to define an apply method on the companion object. Here's an example using Either (because null is never/rarely used in Scala, and exceptions are ugly): class A private (n: Int) { override def toString = s"A($n)" } object A {...
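
The preview above is cut off; here is a minimal self-contained sketch of that companion-object factory, where the validation rule (n must be positive) is made up purely for illustration:

class A private (n: Int) {
  override def toString = s"A($n)"
}

object A {
  // The factory decides whether an instance may be created.
  def apply(n: Int): Either[String, A] =
    if (n > 0) Right(new A(n)) else Left("n must be positive")
}

// A(3) yields Right(A(3)), A(-1) yields Left(...), and new A(3) does not
// compile outside the companion object because the constructor is private.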

PlayFramework: value as is not a member of Array[Byte]

scala,playframework

You are calling the as method on the wrong object. It should look as follows: Ok(bytOfImage).as("image/jpg") ...

How to use the Akka ask pattern without blocking

scala,asynchronous,akka,future

You don't want to block, by waiting on the response of the actor, so you are using Future correctly. The code in the onComplete function is executed, when your actor responds with the list. And since you don't want to block and handle it asynchronously, your last println statement is...

My Scala program won't print anything

scala

So, lots of problems. All that stuff you are doing? It's getting done in the constructor of Book, and redone for every instance. Your main method? That gets compiled to an instance method of Book, not a static method, so it does not serve as an entry point for an executable...
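
A minimal sketch of the fix being hinted at: put the entry point in an object (which compiles to a static main) rather than in the Book class; the Book field here is invented for illustration.

class Book(val title: String)

object Main {
  def main(args: Array[String]): Unit = {
    val book = new Book("Programming in Scala")
    println(book.title)
  }
}

// Equivalently: object Main extends App { ... }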

Scala unapplySeq extractor syntax

scala,pattern-matching,scala-2.11

The equivalent non-infix version is: xs match { case List(x, _, _) => "yes" case _ => "no" } Scala specification says: An infix operation pattern p;op;q is a shorthand for the constructor or extractor pattern op(p,q). The precedence and associativity of operators in patterns is the same as in...

Difficulty with SBT

scala,sbt

The %% in the dependency automatically appends a _2.XX Scala version suffix to your artifact id. It makes Scala dependencies easier to manage, but you can't use it with Java dependencies like Apache HttpComponents. Instead just use %: "org.apache.httpcomponents" % "httpclient" % "4.5" ...
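
A build.sbt sketch of the difference (the library versions are illustrative): %% appends the Scala binary version to the artifact id, so it is only appropriate for Scala libraries, while plain % leaves the artifact id untouched.

libraryDependencies ++= Seq(
  "com.typesafe.akka"         %% "akka-actor" % "2.3.11", // resolves akka-actor_2.XX
  "org.apache.httpcomponents" %  "httpclient" % "4.5"     // Java library: plain %
)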

Spray route get response from child actor

scala,akka,spray

The first problem with your code is that you need to forward from the master actor to the child so that the sender is properly propagated and available for the child to respond to. So change this (in RedisActor): summaryActor ! msg To: summaryActor forward msg That's the primary issue....

Is this definition of a tail recursive fibonacci function tail-recursive?

scala,f#,functional-programming,tail-recursion,continuation-passing

The second call to go on line 4 is not in tail position, it is wrapped inside an anonymous function. (It is in tail position for that function, but not for go itself.) For continuation passing style you need Proper Tail Calls, which Scala unfortunately doesn't have. (In order to...

Implicit Generic.Aux missing on conversion from Shapeless HList to case class

scala,shapeless,type-level-computation

You're very close. The problem is that Scala isn't going to propagate implicit requirements up the call chain automatically for you. If you need a Generic[A, T] instance to call convert, then you'll have to make sure that one's in scope every time you call convert. If A and...

SCALA: change the separator in Array

arrays,string,scala,delimiter

Your question is unclear, but I'll take a shot. To go from: val x = Array("a","x,y","b") to "a:x,y:b" You can use mkString: x.mkString(":") ...

How to unmarshall akka http request entity as string?

json,scala,akka-http

Your code should be okay provided you have the right implicits in scope. If you have an implicit FlowMaterializer in scope then things should work as expected as this code that compiles shows: import akka.http.scaladsl.server.Route import akka.actor.ActorSystem import akka.stream.ActorFlowMaterializer import akka.http.scaladsl.model.StatusCodes._ import akka.http.scaladsl.server.Directives._ import akka.stream.FlowMaterializer implicit val system = ActorSystem("test")...

Scala, how to set up a node class?

scala

Algebraic data types break encapsulation by exposing the internal representation of the type publicly. When you take a functional programming point of view with regards to your design, then mutable state is not something that is a concern normally. Therefore, exposing the internal representation is not really a big deal...

Scala (Slick) HList splitting to case classes

scala,slick

Using the tuple functionality in shapeless you could do: import shapeless._ import syntax.std.tuple._ case class Foo(a: Int, b: String) val hlist = 1 :: "a" :: 2 :: "b" :: HNil Foo.tupled(hlist.take(2).tupled) ...

Type to impose required constrains on a double

scala,implicit-conversion

This enables basic run-time checks: trait RangeBound type Probability = Double with RangeBound implicit def makeProb(p: Double): Probability = { assert (p >= 0.0 && p <= 1.0) p.asInstanceOf[Probability] } implicit val probabilityOrdering = Ordering.Double.asInstanceOf[Ordering[Probability]] ...

How to instantiate lexical.Scanner in a JavaTokenParsers class?

scala,parsing,lexical-scanner

JavaTokenParsers does not implement the Scanners trait, so you would also need to extend this trait (or a trait that extends it) in order to have access to this class. Unless your expr parser accepts the Reader as a parameter (not from its apply method), you'd need to...

Zipping two arrays together with index in Scala?

arrays,scala,zip

Simply do: array1.zip(array2).zipWithIndex.map { case ((a, b), i) => (a, b, i) } ...

Is there any scala library that treat tuples as monads

scala,tuples,monads

Yep, Scalaz provides monad instances for tuples (up to Tuple8): import scalaz.std.anyVal._, scalaz.std.tuple._, scalaz.syntax.monad._ scala> type IntTuple[A] = (Int, A) defined type alias IntTuple scala> pair >>= (a => (a+1).point[IntTuple]) res0: (Int, String) = (2,as1) scala> for (p <- pair) yield (p + 1) res1: (Int, String) = (2,as1) (Note...

Passing a function foreach key of an Array

scala,apache-spark,scala-collections,spark-graphx

You're looking for the groupBy function followed by mapValues to process each group. pairs groupBy {_._1} mapValues { groupOfPairs => doSomething(groupOfPairs) } ...

Scala rep separator for specific area of text

scala,parser-combinators

I guess you are using the RegexParsers (just note that it skips white spaces by default). I'm assuming that it ends with "\n\n--open--" instead (if you can change that otherwise I'll show you how to modify the repsep parser). With this change we see that the text has the following...

Collapse similar case statements in Scala

scala,functional-programming,pattern-matching

You can use a custom extractor to abstract the matching part away from the logic part: object Leafed { def unapply(tree: Tree) = tree match { case Node(Leaf(_, _), parent, qux) => Some((parent, qux)) case Node(parent, Leaf(_, _), qux) => Some((parent, qux)) case _ => None } } And then...

How to effectively get indices of 1s for given binary string using Scala?

scala,functional-programming,higher-order-functions

You can use a filter and then map to get the index : scala> val s = "10010010" s: String = 10010010 scala> s.zipWithIndex.withFilter(_._1 == '1').map(_._2) res0: scala.collection.immutable.IndexedSeq[Int] = Vector(0, 3, 6) Note: I'm using withFilter and not filter to avoid creating a temporary collection. Or you can use collect,...

Scala string replacement of entire words that comply with a pattern

string,scala,scala-collections,scala-string

You can use the \bth\w* pattern to look for words that begin with th followed by other word characters, and then replace all matches with "123" scala> "this is the example, that we think of, anne hathaway".replaceAll("\\bth\\w*", "123") res0: String = 123 is 123 example, 123 we 123 of, anne...

ZipList with Scalaz

list,scala,scalaz,applicative

pure for zip lists repeats the value forever, so it's not possible to define a zippy applicative instance for Scala's List (or for anything like lists). Scalaz does provide a Zip tag for Stream and the appropriate zippy applicative instance, but as far as I know it's still pretty broken....

Convert RDD[Map[String,Double]] to RDD[(String,Double)]

scala,apache-spark,rdd

You can call flatMap with the identity function to 'flatten' the structure of your RDD. rdd.flatMap(identity) ...

Providing implicit value for singletons in Play Json library

json,scala,playframework,scala-macros

For doing that I should define an implicit object like this: implicit object StatusFormat extends Format[Status] { def reads(json: JsValue) = json match { case JsString("Edited") => JsSuccess(Edited) case JsString("NotEdited") => JsSuccess(NotEdited) case _ => JsError("cannot parse it") } def writes(stat: Status) = JsString(stat.toString) } ...

refer to scala function by name?

scala

yyy is not a function, it's a method. You have to either convert it to a function using η-expansion yyy _ or use a function in the first place val yyy = (c: Char) => c.toUpper // or val yyy: Char => Char = c => c.toUpper // or val...

Bulkheading strategies for Akka actors

java,asynchronous,akka,blocking,future

If I understand this correctly, you kind of have two options here: you listen to a Future being completed or you do something with the result: If you want to listen, you can use some callback like final ExecutionContext ec = system.dispatcher(); future.onSuccess(new OnSuccess<String>() { public void onSuccess(String result) {...

Scodec: Coproducts could not find implicit value for parameter auto: scodec.codecs.CoproductBuilderAuto

scala,scodec

The code that is there now needs two minor changes: The Message trait must be sealed, or otherwise, Shapeless will not provide a Generic.Aux[Message, SomeCoproduct] instance. The call to Codec.coproduct[Message] must be after all the subtypes are defined. Moving the companion to the end of the file is sufficient. With...

Scala running issue on eclipse

eclipse,scala

To run as a Scala application, you need to create a Scala App, not a class. In Eclipse: in the Package Explorer, select project/src/package, right-click New > Scala App, enter a name, e.g. Test, and click "Finish". Then select Test.scala, right-click "Run as Scala Application", and see the results in the console window....
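
What the "Scala App" wizard produces is essentially a runnable object; a minimal sketch:

// An object extending App is directly runnable, unlike a plain class.
object Test extends App {
  println("Hello from a runnable Scala App")
}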

implicit resolution for a function argument

scala,implicit,context-bound

You can overcome this by passing a function that calls mergesort to generalizedMergeSort. This call will capture the implicit Ordering: def mergesort[A: Ordering](as: List[A]): List[A] = { generalizedMergeSort(as, mergesort(_: List[A])) } mergesort(_: List[A]) is a closure function of type List[A] => List[A], which calls mergesort with its argument, and the...

Scala slf4j dynamic file name

scala,logging,slf4j

The slf4j library is really an interface to some underlying logging implementation. You would have log4j, logback or some other logging implementation do the heavy lifting, with an adapter jar, as explained in the slf4j documentation. You would then provide the details in the properties file for log4j for instance,...

Solving maze with Backtracking

scala,backtracking,maze

I'm only going to comment on findStart for now. There are two things wrong with findStart: findStart is recursively called on every adjacent cell. Unfortunately, the neighbouring cell of any neighbour is the cell itself. The function never checks if you can actually walk on a given cell (I assume...

How to get notified when unfiltered Netty server actually gets shutdown?

scala,testing,netty,unfiltered

Easy answer: replace your Unfiltered Netty server with a HTTP4S Blaze server. var server: org.http4s.server.Server = null val go: Task[Server] = org.http4s.server.blaze.BlazeBuilder .bindHttp(mockServicePort) .mountService(mockService) .start before { server = go.run } after { server.shutdown.run } There's also an awaitShutdown that blocks until the server shuts down. But since shutdown is...

Spray microservice assembly deduplicate

scala,sbt,akka,spray,microservices

The issue, it seems, is that a transitive dependency of the dependency is resulting in two different versions of metrics-core. The best thing to do would be to use the right library dependency so that you end up with a single version of this library. Please use https://github.com/jrudolph/sbt-dependency-graph , if it is...

Access key from mapValues or flatMapValues?

scala,apache-spark

In this case you can use mapPartitions with the preservesPartitioning attribute. x.map((it => it.map { case (k,rr) => (k, someFun(rr, k)) }), preservesPartitioning = true) You just have to make sure you are not changing the partitioning, i.e. don't change the key....

Play Framework Form Error Handling

scala,playframework,playframework-2.3,playframework-2.4

Have a look at play documentation: Writing your own field constructor. You can check on errors with @if(elements.hasErrors) within the template of your custom field constructor. <div class="input-with-label text-left @if(elements.hasErrors){field-error}"> ... Edit: You can pass the error state of your field via the args parameter to your input. From the...