
Insert into remote host error #94

Open
badalb opened this issue Jan 29, 2016 · 4 comments

Comments


badalb commented Jan 29, 2016

Inserting into a remote host does not seem to work properly.

My client code:
String host = "<remote_host_ip>";

    DDataSource.adjustPoolSettings(100, 50, 50, 50);

    List<String> inserts = ImmutableList
            .of("INSERT INTO abc (timestamp , provider , uuid, DOUBLE_SUM(click) AS click) "
                    + "  VALUES ('2014-10-31 00:00:00','sri','fsd-sdf-dfgdf','2') WHERE"
                    + " interval BETWEEN '2014-10-30' AND '2014-11-01'  BREAK BY 'day'",
                    "INSERT INTO abc (timestamp , provider , uuid, DOUBLE_SUM(click) AS click)  "
                            + "VALUES ('2014-11-01 00:00:00','siri','abd-dfgdf-32de1','5') WHERE"
                            + " interval BETWEEN '2014-11-01' AND '2014-11-02'  BREAK BY 'day'",
                    "INSERT INTO abc (timestamp , provider , uuid, DOUBLE_SUM(click) AS click)"
                            + "  VALUES ('2014-11-02 00:00:00','preeti','dfgdf-fsd-sdf','8') WHERE"
                            + " interval BETWEEN '2014-11-02' AND '2014-11-03'  BREAK BY 'day'",
                    "INSERT INTO abc (timestamp , provider , uuid, DOUBLE_SUM(click) AS click)"
                            + "  VALUES ('2014-11-03 00:00:00','aditya','fsd-sdf-dfgdf','1') WHERE"
                            + " interval BETWEEN '2014-11-03' AND '2014-11-04'  BREAK BY 'day'",
                    "INSERT INTO abc (timestamp , provider , uuid, DOUBLE_SUM(click) AS click)"
                            + "  VALUES ('2014-11-04 00:00:00','adi','fsd-sdf-dfgdf','9999') WHERE"
                            + " interval BETWEEN '2014-11-04' AND '2014-11-05'  BREAK BY 'day'",
                    "INSERT INTO abc (timestamp , provider , uuid, DOUBLE_SUM(click) AS click) "
                            + " VALUES ('2014-11-05 00:00:00','antham','fsd-sdf-dfgdf','99999')"
                            + " WHERE interval BETWEEN '2014-11-05' AND '2014-11-06'  BREAK BY 'day'");
    // DDataSource driver = new DDataSource("localhost", 8080, "localhost",
    //         8082, "localhost", 8087, "localhost", 3306, "root", "password",
    //         "druid");
    DDataSource driver = new DDataSource(host, 8082, host, 8081, host,
            8190, host, 3306, "root", "password", "druid");

    for (String insert : inserts) {
        driver.query(insert, null, null, true, "sql");
    }

BasicInsertMeta.java

dataPath is always null, so the CSV file is created on the client's local filesystem:

public Map<String, Object> getFirehose() {
    Map<String, Object> finalMap = new LinkedHashMap<>();
    finalMap.put("type", "local");
    if (dataPath != null) {
        int folderEndIndex = dataPath.lastIndexOf("/");
        finalMap.put("baseDir", dataPath.substring(0, folderEndIndex + 1));
        finalMap.put("filter", (folderEndIndex == dataPath.length() - 1) ? "*" : dataPath.substring(folderEndIndex + 1));
        if (dataPath.endsWith("json")) {
            dataFormat = "json";
        } else if (dataPath.endsWith("csv")) {
            dataFormat = "csv";
        }
    } else {
        finalMap.put("baseDir", tmpFolder);
        String fileName = UUID.randomUUID().toString() + ".csv";
        finalMap.put("filter", fileName);
        dataFormat = "csv";
        if (values.isEmpty()) {
            throw new IllegalStateException("No values to insert !!");
        }
        try {
            File file = new File(tmpFolder + File.separator + fileName);
            FileUtils.write(file, Joiner.on(",").join(values));
            System.out.println("Written to " + file);
            Object timestamp = values.get(0);
            timestampFormat = TimeUtils.detectFormat(timestamp.toString());
        } catch (IOException ex) {
            Logger.getLogger(BasicInsertMeta.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
    return finalMap;
}
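The baseDir/filter derivation in the `dataPath != null` branch can be exercised in isolation. A minimal sketch, assuming nothing from Sql4D (`PathSplitDemo` and `split` are illustrative names, not part of the project):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;

public class PathSplitDemo {
    // Mirrors the baseDir/filter split in getFirehose(): everything up to and
    // including the last '/' becomes baseDir; the remainder becomes the filter,
    // or "*" when the path ends with '/'.
    static Map.Entry<String, String> split(String dataPath) {
        int folderEndIndex = dataPath.lastIndexOf("/");
        String baseDir = dataPath.substring(0, folderEndIndex + 1);
        String filter = (folderEndIndex == dataPath.length() - 1)
                ? "*" : dataPath.substring(folderEndIndex + 1);
        return new SimpleEntry<>(baseDir, filter);
    }

    public static void main(String[] args) {
        System.out.println(split("/data/in/clicks.csv")); // /data/in/=clicks.csv
        System.out.println(split("/data/in/"));           // /data/in/=*
    }
}
```

The failure reported in this issue comes from the else branch instead: tmpFolder resolves on the client machine, but Druid's LocalFirehoseFactory resolves the same baseDir on the node running the index task, where that directory does not exist.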

Error log:
Coordinator console error log: the task searches for the CSV on the remote server, although it was written on the client.
2016-01-29T06:15:19,189 INFO [task-runner-0] io.druid.segment.realtime.firehose.LocalFirehoseFactory - Searching for all [6e06a2e0-ae0e-479e-ad38-60769f27802f.csv] in and beneath [/var/folders/_y/hxlk4pz16qj2lsp6gs_0yjxnbt3pf8/T]
2016-01-29T06:15:19,197 ERROR [task-runner-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Exception while running task[IndexTask{id=index_abc_2016-01-29T06:15:11.269Z, type=index, dataSource=abc}]
java.lang.IllegalArgumentException: Parameter 'directory' is not a directory
at org.apache.commons.io.FileUtils.listFiles(FileUtils.java:358) ~[commons-io-2.0.1.jar:2.0.1]
at io.druid.segment.realtime.firehose.LocalFirehoseFactory.connect(LocalFirehoseFactory.java:87) ~[druid-server-0.8.2.jar:0.8.2]
at io.druid.segment.realtime.firehose.LocalFirehoseFactory.connect(LocalFirehoseFactory.java:43) ~[druid-server-0.8.2.jar:0.8.2]
at io.druid.indexing.common.task.IndexTask.getDataIntervals(IndexTask.java:234) ~[druid-indexing-service-0.8.2.jar:0.8.2]
at io.druid.indexing.common.task.IndexTask.run(IndexTask.java:192) ~[druid-indexing-service-0.8.2.jar:0.8.2]
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:221) [druid-indexing-service-0.8.2.jar:0.8.2]
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:200) [druid-indexing-service-0.8.2.jar:0.8.2]
at java.util.concurrent.FutureTask.run(FutureTask.java:262) [?:1.7.0_67]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [?:1.7.0_67]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [?:1.7.0_67]
at java.lang.Thread.run(Thread.java:745) [?:1.7.0_67]
2016-01-29T06:15:19,202 INFO [task-runner-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
"id" : "index_abc_2016-01-29T06:15:11.269Z",
"status" : "FAILED",
"duration" : 65
}

druidGParser.java line 1142 always returns 88 for a query like the one above, forcing the switch statement into case 1 and breaking out of the loop.

@srikalyc (Owner)

INSERT INTO table VALUES ('') works only with LOCAL. If you need a remote host to work, use INSERT_HADOOP. Alternatively, you can use INSERT INTO .. FILE, but that file must exist on the remote host (if you have NFS, much better).
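For reference, the client call pattern from the issue stays the same; only the statement changes. A hypothetical sketch against the same DDataSource API as above — the statement text is a placeholder, and the exact INSERT_HADOOP / INSERT INTO .. FILE grammar should be confirmed against the Sql4D documentation:

```java
// Assumes the same Sql4D client setup as in the issue; not self-contained.
DDataSource driver = new DDataSource(host, 8082, host, 8081, host,
        8190, host, 3306, "root", "password", "druid");

// Per the maintainer's guidance for remote hosts:
//  - use INSERT_HADOOP (data readable by the remote cluster), or
//  - use INSERT INTO .. FILE with a file that already exists on the remote
//    host (or on an NFS mount shared with it).
// The placeholder below is NOT valid syntax; substitute a statement from the
// Sql4D docs.
String stmt = /* INSERT_HADOOP statement per the Sql4D docs */ "...";
driver.query(stmt, null, null, true, "sql");
```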

Thanks
kalyan



badalb commented Feb 1, 2016

Hi,
Thanks a lot for your response. Do you have any sample code demonstrating INSERT_HADOOP? Could you please help?

Regards,
Badal


srikalyc commented Feb 1, 2016

Badal, if you look at https://github.com/srikalyc/Sql4D/ and scroll down a bit, you will find a table with a bunch of links to blog posts. Hope this helps.

Thanks
kalyan



badalb commented Feb 2, 2016

Hi Kalyan,

Got it. Thanks a lot.

Regards,
Badal
