Exchanges: SIP vs. Direct Feed Timestamps


NYSE July 8, 2015 Trading System Outage

Spark 1.4.1 startup script debugged.

/home/hduser/spark-1.4.1-bin-hadoop2.6/bin/run-example
|_
    /home/hduser/spark-1.4.1-bin-hadoop2.6/bin/spark-submit
    |_
        /home/hduser/spark-1.4.1-bin-hadoop2.6/bin/spark-class
        
hduser@shaklubix1:~/spark-1.4.1-bin-hadoop2.6/bin$ ./run-example JavaWordCount /home/hduser/spark-1.4.1-bin-hadoop2.6/README.md
|_
    Running /home/hduser/spark-1.4.1-bin-hadoop2.6/bin/spark-submit
    shak debug: /home/hduser/spark-1.4.1-bin-hadoop2.6/bin/spark-class: SPARK_HOME=/home/hduser/spark-1.4.1-bin-hadoop2.6
    shak debug: /home/hduser/spark-1.4.1-bin-hadoop2.6/bin/spark-class: ASSEMBLY_DIR=/home/hduser/spark-1.4.1-bin-hadoop2.6/lib
    shak debug: /home/hduser/spark-1.4.1-bin-hadoop2.6/bin/spark-class: $@
    org.apache.spark.deploy.SparkSubmit
        --master
        local[*]
        --class
        org.apache.spark.examples.JavaWordCount
        /home/hduser/spark-1.4.1-bin-hadoop2.6/lib/spark-examples-1.4.1-hadoop2.6.0.jar
        /home/hduser/spark-1.4.1-bin-hadoop2.6/README.md
    shak debug: -----

    shak debug: /home/hduser/spark-1.4.1-bin-hadoop2.6/bin/spark-class: LAUNCH_CLASSPATH
        /home/hduser/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar

    shak debug: /home/hduser/spark-1.4.1-bin-hadoop2.6/bin/spark-class: ${CMD[@]}
        /usr/lib/jvm/java-7-oracle/bin/java
        -cp
        /home/hduser/spark-1.4.1-bin-hadoop2.6/conf/
        /home/hduser/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar
        /home/hduser/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar
        /home/hduser/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar
        /home/hduser/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar
        -Xms512m
        -Xmx512m
        -XX:MaxPermSize=256m
        org.apache.spark.deploy.SparkSubmit
        --master
        local[*]
        --class
        org.apache.spark.examples.JavaWordCount
        /home/hduser/spark-1.4.1-bin-hadoop2.6/lib/spark-examples-1.4.1-hadoop2.6.0.jar
        /home/hduser/spark-1.4.1-bin-hadoop2.6/README.md
        shak debug: -----

    Running /home/hduser/spark-1.4.1-bin-hadoop2.6/bin/spark-class
    exec /usr/lib/jvm/java-7-oracle/bin/java -cp /home/hduser/spark-1.4.1-bin-hadoop2.6/conf/:/home/hduser/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar:/home/hduser/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar:/home/hduser/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar:/home/hduser/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar -Xms512m -Xmx512m -XX:MaxPermSize=256m org.apache.spark.deploy.SparkSubmit --master local[*] --class org.apache.spark.examples.JavaWordCount /home/hduser/spark-1.4.1-bin-hadoop2.6/lib/spark-examples-1.4.1-hadoop2.6.0.jar /home/hduser/spark-1.4.1-bin-hadoop2.6/README.md    

JAVA 8 FLATMAP EXAMPLE


A developer knows a set of programming languages:

import java.util.HashSet;
import java.util.Set;

public class Developer {

    private final String name;
    private final Set<String> languages;

    public Developer(String name) {
        this.languages = new HashSet<>();
        this.name = name;
    }

    public void add(String language) {
        this.languages.add(language);
    }

    public Set<String> getLanguages() {
        return languages;
    }
}

A team has more than one developer. Now we would like to know the aggregate programming-language skills of a given team. Stream#flatMap is perfect for "flattening" collections:

import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import static org.junit.Assert.assertTrue;
import org.junit.Test;

public class FlatMapTest {

    @Test
    public void flatMap() {
        List<Developer> team = new ArrayList<>();
        Developer polyglot = new Developer("esoteric");
        polyglot.add("clojure");
        polyglot.add("scala");
        polyglot.add("groovy");
        polyglot.add("go");

        Developer busy = new Developer("pragmatic");
        busy.add("java");
        busy.add("javascript");

        team.add(polyglot);
        team.add(busy);

        List<String> teamLanguages = team.stream()
                .map(d -> d.getLanguages())
                .flatMap(l -> l.stream())
                .collect(Collectors.toList());
        assertTrue(teamLanguages.containsAll(polyglot.getLanguages()));
        assertTrue(teamLanguages.containsAll(busy.getLanguages()));
    }
}
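For comparison, the same map-then-flatten idea can be sketched in Python. This is a hypothetical mirror of the Java test above, with the two Developer objects reduced to plain sets of language names:

```python
from itertools import chain

# hypothetical stand-ins for the polyglot and busy developers above
polyglot = {"clojure", "scala", "groovy", "go"}
busy = {"java", "javascript"}
team = [polyglot, busy]

# flatMap == map + flatten; chain.from_iterable does the flattening step
team_languages = list(chain.from_iterable(team))
assert set(team_languages) == polyglot | busy
```

The shape is the same as the Java version: produce one collection per element, then concatenate them into a single flat sequence.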


Big Picture on Big Data


NYSE_CGW_FIX_GATEWAY_Specification_and_API

NYSE_CGW_FIX_GATEWAY - Java TransactTools ttConnect-based FIX gateway for the NYSE Arca Exchange

What are closures?


In the simplest sense, a closure is a function instance: just as a class can be instantiated into objects, a function can be instantiated with specific values. In Python, for example, where everything is an object (modules, classes, functions, attributes, etc.), a closure is the object returned by a function, instantiated with the parameters that function was invoked with. The returned object is itself a function (an inner function) produced by another function (an outer function).

Now for a more canonical definition: "a closure is data attached to code." A closure is about maintaining state of some kind inside a function.

Many functional languages (Lisp, Haskell, Scala, and others) rely heavily on closures. Python supports them too, though they are not at the very core of the language. Python is, however, a fine language for getting to know closures, so even if your expertise is in some other language, you might learn a thing or two here.

An Example

Now consider this:

def make_log(level):
    '''a closure factory that manufactures different loggers'''
    def _anon(message):
        print("{}: {}".format(level, message))
    return _anon

if __name__ == '__main__':
    print(make_log.__doc__)
    # log_info closure
    log_info = make_log("info")
    # log_warn closure
    log_warn = make_log("warn")
    # log_error closure
    log_error = make_log("error")

    log_info("loading configuration file")                        # invoking the log_info closure
    log_warn("config value missing. using default value")         # invoking the log_warn closure
    log_error("network socket error. cannot connect to server")   # invoking the log_error closure
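The "data attached to code" phrasing is quite literal in CPython: the captured level is stored in cells on the returned function's __closure__ attribute. A quick sketch, using a variant of make_log that returns the formatted string instead of printing it:

```python
def make_log(level):
    '''a closure that manufactures different logs'''
    def _anon(message):
        return "{}: {}".format(level, message)
    return _anon

log_warn = make_log("warn")
# the captured value lives in a closure cell attached to _anon
print(log_warn.__closure__[0].cell_contents)   # prints "warn"
# the free variable names are recorded on the inner code object
print(log_warn.__code__.co_freevars)           # prints ('level',)
```

Each call to make_log produces a fresh inner function with its own cell, which is why log_info, log_warn, and log_error above keep their levels independently.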


Spark RDDs - how do they work?




NYSE_CCG_FIX_Specification

NYSE_CCG_FIX_GATEWAY Specification - Java TransactTools ttConnect-based FIX gateway for the NYSE Classic Exchange

NYSE_UGW_FIX_GATEWAY_Specification_and_API

NYSE_UGW_FIX_GATEWAY_Specification_and_API - C++-based Generation 2 order entry gateway (for the NYSE Arca Exchange)

NYSE_Arca_Drop_Copy_FIX_Specification

NYSE UTP Direct_Specification

NYSE UTP Direct_Specification - C++ binary high-speed order entry gateway for the Arca Exchange