The Northeast Scala Symposium is a Scala-focused community gathering.
NE Scala mixes speaker-oriented conference presentations with unconference-style sessions and discussions. (And coffee. Lots of coffee.) All presenters are attendees, and all attendees select presenters.
This year, thanks to Lightbend, the Day 1 talks will be streamed live over the Internet for up to 100 remote viewers. If you can't join us in person, you can join us virtually.
The talks will be streamed via WebEx. Go here for access to the stream.
There's a 100-viewer limit on the WebEx stream.
Scalaz-Stream (soon to be called "FS2" or "Functional Streams for Scala") is a library that lets us write compositional, modular, safe, and efficient asynchronous I/O using an expressive purely functional API. In this hands-on talk we'll see how to get started with the library, cook up a few streams, and look at ways of combining them into larger systems.
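The library's own API is best learned from the talk, but the compositional style it encourages can be sketched with nothing but plain Scala. The snippet below is an analogy, not the scalaz-stream/FS2 API: it shows the key idea the library shares, namely describing a pipeline as a value built from small pieces, with nothing evaluated until you ask for results.

```scala
// Illustration only: plain lazy Scala streams, not the scalaz-stream/FS2 API.
// Small, reusable pieces, composed into a larger description:
val numbers: Stream[Int]    = Stream.from(1)           // infinite, lazy
val doubled: Stream[Int]    = numbers.map(_ * 2)
val smallEvens: Stream[Int] = doubled.filter(_ % 4 == 0)

// Nothing above has been evaluated yet; "running" happens only here:
val result: List[Int] = smallEvens.take(3).toList
// result == List(4, 8, 12)
```

The library takes the same build-then-run separation much further, adding effect tracking, resource safety, and asynchronous evaluation.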
Traditionally, Apache Spark jobs have been written using Resilient Distributed Datasets (RDDs), a Scala Collections-like API. RDDs are type-safe, but they can be problematic: it's easy to write a suboptimal job, and RDDs are significantly slower in Python than in Scala. DataFrames address some of these problems, and they're much faster, even in Scala; but DataFrames aren't type-safe, and they're arguably less flexible.
Enter Datasets, a type-safe, object-oriented programming interface that works with the DataFrames API, provides some of the benefits of RDDs, and can be optimized via the Catalyst optimizer.
This talk will briefly recap RDDs and DataFrames, introduce the Datasets API, and then, through a live demonstration, compare the performance of all three against the same non-trivial data source.
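Spark itself is too heavyweight for a self-contained snippet, but the central trade-off can be shown in miniature with plain Scala (this is an analogy, not Spark's API): untyped rows, like a DataFrame, push errors to runtime, while case classes, like a Dataset or RDD, catch them at compile time.

```scala
// Miniature of the trade-off (plain Scala, not Spark's API).

// "DataFrame-like": untyped rows; a misspelled column fails only at runtime.
val untypedRows: List[Map[String, Any]] =
  List(Map("name" -> "Ada", "age" -> 36))
// untypedRows.map(_("aeg"))  // compiles fine, throws NoSuchElementException at runtime

// "Dataset-like": a case class; a misspelled field fails at compile time.
case class Person(name: String, age: Int)
val typedRows: List[Person] = List(Person("Ada", 36))
val ages: List[Int] = typedRows.map(_.age)  // _.aeg would not compile at all
```

Datasets aim to keep the compile-time guarantees of the second style while still feeding the Catalyst optimizer the structural information it needs.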
Since their introduction in Scala 2.10, macros have taken Scala by storm. They pop up everywhere, often providing elegant syntax extensions for Scala libraries; note the prevalence of "sql" string interpolation macros, for example.
Many developers, however, are scared and confused by macros, and rightly so: many examples of macros require deep knowledge of the Scala AST, compiler internals, and oddities. But hope exists: newer features like quasiquotes make it ridiculously easy to write powerful macros with code templating.
In this talk, we'll demystify how macros work and are constructed, walking through the different "types" of macros that Scala supports. We'll finish with a look at the newer quasiquotes and annotations features as a way of quickly building powerful new code features into your own Scala projects.
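To give a taste of why quasiquotes lower the barrier: the same `q"..."` interpolator both builds and destructures trees, so macro code can read like the code it generates. A minimal sketch, assuming Scala 2.x with scala-reflect on the classpath (here using the runtime universe for demonstration; a real macro uses the macro context's universe):

```scala
// Assumes Scala 2.x; scala-reflect provides the quasiquote interpolator.
import scala.reflect.runtime.universe._

// q"..." parses the string into a Tree, so you build code without
// hand-assembling AST nodes like Apply(Select(...), ...).
val tree: Tree = q"List(1, 2, 3).map(x => x * 2)"

// The same syntax deconstructs trees in pattern position:
val q"$receiver.map($fn)" = tree
println(receiver)  // the List(1, 2, 3) subtree
```

Contrast this with spelling out the raw `Apply`/`Select`/`Function` nodes by hand, which is where most of the fear of macros comes from.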
Logic programming is perhaps underappreciated compared to its declarative sibling, functional programming. In this lightning talk, we will examine a logic programming DSL built on top of microKanren in Scala. We will demonstrate its expressive power by using it to help us find the solution(s) to a particularly mind-bending variant of a classic puzzle. The goal is to show that we can solve problems relationally yet write what feels like idiomatic, functional Scala.
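For readers new to the Kanren family: the core idea is tiny. A toy sketch of its flavor in plain Scala (not the talk's DSL, and far less general than real microKanren, which unifies arbitrary terms): a goal maps a set of variable bindings to a stream of consistent extensions of those bindings, and goals compose with "and" and "or".

```scala
// Toy illustration of the microKanren idea, not the talk's DSL.
// A goal takes candidate bindings and yields every consistent extension.
type Bindings = Map[String, Int]
type Goal     = Bindings => Stream[Bindings]

// Relate a variable to a value (a very restricted form of unification):
def eqv(v: String, n: Int): Goal = b => b.get(v) match {
  case Some(m) if m == n => Stream(b)      // already consistent
  case Some(_)           => Stream.empty   // contradiction: no solutions
  case None              => Stream(b + (v -> n))
}

def conj(g: Goal, h: Goal): Goal = b => g(b).flatMap(h)  // "and"
def disj(g: Goal, h: Goal): Goal = b => g(b) ++ h(b)     // "or"

// "x is 1 or x is 2, and y is double x" — stated relationally:
val solutions: List[Bindings] =
  conj(disj(eqv("x", 1), eqv("x", 2)),
       b => Stream(b + ("y" -> b("x") * 2)))(Map.empty).toList
// solutions.map(_("y")) == List(2, 4)
```

Notice that we never wrote a search loop; enumeration of solutions falls out of how the goals compose, which is exactly the appeal of solving puzzles relationally.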
Logging is more complicated than most people realize, and by the time they do realize it, it's too late to do anything about it until "next time." Learn how to use a writer monad as a logging mechanism in your code. More importantly, learn why you would want to do this, and what the benefits are.
There is a price to pay in adopting this approach, but it's more "functional", addresses existing challenges, and enables new, exciting capabilities.
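To make the approach concrete, here is a minimal writer-monad sketch in plain Scala (libraries such as Scalaz and Cats ship full-featured versions; the names below are illustrative): the log is carried alongside the result, so logging becomes a pure value rather than a side effect.

```scala
// Minimal writer monad: a value paired with an accumulating log.
case class Writer[A](value: A, log: Vector[String]) {
  def map[B](f: A => B): Writer[B] = Writer(f(value), log)
  def flatMap[B](f: A => Writer[B]): Writer[B] = {
    val next = f(value)
    Writer(next.value, log ++ next.log)  // logs concatenate as steps sequence
  }
}
def logged[A](a: A)(msg: String): Writer[A] = Writer(a, Vector(msg))

// The log travels with the computation; no global, side-effecting logger.
val result: Writer[Int] = for {
  x <- logged(3)("seed = 3")
  y <- logged(x * 2)(s"doubled to ${x * 2}")
} yield y
// result.value == 6
// result.log   == Vector("seed = 3", "doubled to 6")
```

Because the log is just data, you can test it, route it, or discard it per computation, which is the source of both the benefits and the costs the talk weighs.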
The demand for stream processing is increasing rapidly. Immense amounts of data have to be processed quickly from a growing set of disparate data sources, which pushes the limits of traditional data processing infrastructures. Stream-based applications include trading, social networks, the Internet of Things, system monitoring, and many other examples.
A number of powerful, easy-to-use open-source platforms have emerged to address this. But the same problem can be solved in different ways, the frameworks target varied but sometimes overlapping use cases, and they often use different vocabularies for similar concepts. This can lead to confusion, longer development time, or costly wrong decisions.
In this talk we are going to discuss various state-of-the-art open-source distributed streaming frameworks: their similarities and differences, implementation trade-offs, and intended use cases. Beyond that, I'm going to speak about Fast Data, the theory of streaming, framework evaluation, and more. My goal is to provide a comprehensive overview of modern streaming frameworks and to help fellow developers pick the best possible one for their particular use case.
In ScalaTest 3.0's new async testing styles, tests have a result type of Future[Assertion]. Instead of blocking until a future completes, then performing assertions on the result, you map assertions onto the future and return the resulting Future[Assertion] to ScalaTest. The test will complete asynchronously when the Future[Assertion] completes. This way of testing requires a different mindset. In this talk I'll walk you through the design, show you how to use it, and suggest best practices for async testing on both the JVM and Scala.js.
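The mindset shift can be previewed with the standard library alone. The sketch below uses plain `Future` and stdlib `assert` (which returns `Unit`, hence `Future[Unit]`; in ScalaTest's async styles the assertion yields an `Assertion` value instead, and the framework, not your code, awaits the future):

```scala
// Stdlib-only sketch of the async-testing pattern; fetchSum is a stand-in
// for any asynchronous operation under test.
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

def fetchSum(a: Int, b: Int): Future[Int] = Future(a + b)

// Blocking style: wait for the result, then assert on it.
// val sum = Await.result(fetchSum(1, 2), 1.second); assert(sum == 3)

// Async style: map the assertion onto the future and hand the future back.
val checked: Future[Unit] = fetchSum(1, 2).map(sum => assert(sum == 3))

Await.result(checked, 5.seconds)  // only so this standalone sketch terminates
```

The payoff is that no test thread ever blocks, which matters on the JVM for throughput and on Scala.js, where blocking is not possible at all.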
This talk demos a vision that has started to leak into reality as small prototypes. A vision of a world where writing and maintaining build scripts is fun and not black magic. A world without scoped-multi-config-multi-project builds. With fast dependency resolution. Intuitive and statically checked build configuration. Where large builds are composed from small, self-contained, easy-to-understand builds. Where you only need to learn a small Scala library, nothing more. We'll talk about concepts, implementation specifics, and do a live demo. This is not a complete tool, but a collection of crucial pieces and insights that may one day contribute to one.
Among the most important properties of software is 'modularity', the notion that you build complex modules from simpler ones. It is common practice to adhere to this concept in the context of logical components, such as separating an application into input, transformation, and output. Once we drill down into lower-level details, however, it can get a bit finicky to enforce the same kind of modularity. In this talk we will explore how we can build complex programs from simple building blocks regardless of the level we are working at, be it functions, modules, or (embedded) programs. In particular, we will see how an expressive type system can be leveraged to keep our modularity in check as we build more and more complex modules. Examples will be given in Scala, but the techniques discussed are largely language-agnostic.
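At the lowest level, the input/transformation/output separation mentioned above is just function composition, with the type system as the enforcer. A minimal sketch (illustrative names, not from the talk):

```scala
// Small, typed building blocks; the types document what composes with what.
val parse:     String    => List[Int] = _.split(",").toList.map(_.trim.toInt)
val transform: List[Int] => List[Int] = _.map(_ * 2)
val render:    List[Int] => String    = _.mkString("-")

// Composition is compiler-checked: swap two stages and it won't compile,
// because render's input type doesn't match parse's output type.
val pipeline: String => String = parse andThen transform andThen render
// pipeline("1, 2, 3") == "2-4-6"
```

The same discipline scales up: whether the pieces are functions, modules, or embedded programs, making their interfaces precise types lets the compiler reject ill-formed compositions instead of leaving it to convention.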
Your app is a hit. Everyone's using your service. You've instrumented your website to collect real time usage data and you've set up a data processing pipeline to work with large amounts of data in real time, but how do you bridge the gap between web page and data pipeline? With websockets, Kafka and Akka Streams, of course!
This talk covers using Akka Streams and Akka HTTP to, in less than 100 lines of code, build a server that consumes streams of events via websocket and publishes them to a Kafka topic, after which you can use the data analysis framework of your choice to analyze them to your heart's content.
Scala's type system provides generics in two ways: type members and type parameters, and most advanced Scala developers will use both in their code at different times. But how do we choose between them? Usually one or the other will feel "right" for a particular use case, and—if not—we can usually refactor and take the alternative approach. But could we learn to be better at making the "right" choice first time? I'll explore the most important considerations when choosing between them, and look at whether they are completely interchangeable in all cases, or whether certain scenarios force us down one path or the other.
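For readers who haven't seen both forms side by side, here is the same abstraction written both ways (illustrative names, not taken from the talk), plus one place where the choice visibly matters: a type member gives you path-dependent types without wildcard syntax.

```scala
// Type parameter: the element type is part of the trait's "signature".
trait PStack[A] { def top: A }

// Type member: the element type is a member that callers may leave abstract.
trait MStack {
  type Elem
  def top: Elem
}

class IntPStack(xs: List[Int]) extends PStack[Int] { def top: Int = xs.head }
class IntMStack(xs: List[Int]) extends MStack {
  type Elem = Int
  def top: Int = xs.head
}

// With a type member, a method can accept "some stack" and still return
// a precisely typed element, via a path-dependent result type:
def peek(s: MStack): s.Elem = s.top
val n: Int = peek(new IntMStack(List(1, 2)))  // compiler knows Elem = Int
```

The type-parameter equivalent of `peek` would need `PStack[_]` or its own type parameter, which is one of the scenarios where the two encodings stop feeling interchangeable.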
Sponsored by Chariot Solutions, the drink-up will be held downstairs, at JG Domestic, from 17:30 to 19:30. Beer, wine, and soft drinks are free (yes, "free" as in "beer"). Other drinks are available for purchase. Light appetizers and happy hour munchies will also be served. Bring your name tag.
Nobody likes a jerk, so show respect for those around you.
NE Scala is dedicated to providing a harassment-free experience for everyone, regardless of gender, gender identity and expression, sexual orientation, disability, physical appearance, body size, race, or religion (or lack thereof). We do not tolerate harassment of participants in any form.
All communication should be appropriate for a technical audience, including people of many different backgrounds. Sexual language, innuendo, and imagery are not appropriate for any symposium venue, including talks.
Participants violating these rules may be asked to leave without a refund, at the sole discretion of the organizers.
Since this is a gathering of static typists, offenders will be caught at compile time.
This year, we're excited to tell you about some other events that are associated with NE Scala 2016.
The summit is open to all, not just current contributors to and users of the Typelevel projects, and the Typelevel organizers are especially keen to encourage participation from people who are new to them.
While many of the Typelevel projects use somewhat "advanced" Scala, they are a lot more approachable than many people think, and a major part of Typelevel's mission is to make the ideas they embody much more widely accessible.
So, if you're interested in types and pure functional programming, want to make those ideas commonplace and are willing to abide by the Typelevel code of conduct, then this is the place for you and we'd love to see you there.
NewCircle is offering a public Intro to Spark class, to be held in Philadelphia March 15–17. NE Scala attendees get a $200 discount on the class fee. (The promotion code will be emailed to all those who RSVP, a week or two after RSVPs open.) The class will be held in Suite 1700 of the American Executive Center at 1515 Market Street, Philadelphia, PA 19102. Full details on the class, including registration information, are on the NewCircle event page.