- Making my Markdown Format package for Atom have zero external dependencies by compiling the Go code into JavaScript. That way there's no need to install and point to the markdownfmt binary, making installation easier. https://github.com/shurcooL/atom-markdown-format/commit/6b5f...
- Replacing JavaScript with Go code in Go Package Store, still a WIP PR because I need to figure out a `go get` difficulty, but aside from that it works perfectly well. https://github.com/shurcooL/Go-Package-Store/pull/18
- At work, I've created a small standalone project with a frontend UI that reuses some common structs and HTML template code between the frontend and backend (à la Node.js, except all in Go).
- Working on porting my small OpenGL game written in Go to run in the browser. The goal is to have the same Go code use OpenGL when on desktop, but use GopherJS and WebGL when compiling for the browser.
Basically, it lets me do all the things that one would normally be forced to use JavaScript for (i.e., frontend code) but without having to use JavaScript (I'm not good at it and I don't like it), while benefiting from the Go compiler catching errors, plus autocomplete, godocs, goimports, etc. It also allows me to import and use existing Go packages that perform non-trivial tasks.
Does it add type checks into compiled code to emulate strict typing?
Variable types in the source code can be checked by Go itself, but server responses (JSON) arrive without checked types. Really interesting.
I'd like to point out that goroutine support has been added which allows concurrency in the browser. You can literally avoid using callback soup using this.
You can use it without going through Go, and have proper support for normal JavaScript. The concepts are explained here: http://jlongster.com/Taming-the-Asynchronous-Beast-with-CSP-... (EDIT: I see now you are asking why Go's channels are more powerful than core.async's)
Can anyone speak to the trade-off in file size, for non-trivial codebases? There is a stated design goal of "small size of generated code", but the default snippet in the playground introduces quite a large amount of boilerplate. Is the penalty on file size fairly front-loaded or does the generated code scale horribly?
Annoyingly but predictably, I don't really have any nontrivial code of my own around that doesn't violate any of the (entirely reasonable, except for x509) constraints.
GopherJS happily compiles itself, however. The raw source code looks to be about 250KB. The native binary ends up 7.7MB, and the unminified js is 3MB. A pass through UglifyJS with default options reduces that to 1.7MB (I have not tested the result):
~/gocode/bin$ ls -lh *gopher*
-rwxr-xr-x 1 nknight staff 7.7M Oct 14 14:02 gopherjs
-rw-r--r-- 1 nknight staff 3.0M Oct 14 14:32 gopherjs.js
-rw-r--r-- 1 nknight staff 233K Oct 14 14:32 gopherjs.js.map
-rw-r--r-- 1 nknight staff 1.7M Oct 14 14:41 min-gopherjs.js
~/gocode/bin$
Edit: Oh, and the playground example minifies to 300KB:
~/gjstest/playground$ ls -lh
total 1784
-rw-r--r-- 1 nknight staff 194B Oct 14 14:05 main.go
-rw-r--r-- 1 nknight staff 536K Oct 14 14:53 main.js
-rw-r--r-- 1 nknight staff 42K Oct 14 14:53 main.js.map
-rw-r--r-- 1 nknight staff 300K Oct 14 14:54 min-main.js
~/gjstest/playground$
The thing is that if you want to compile a single fmt.Println("Hello") line, that will produce a large output because of what's needed to support it, including dealing with Go/JS types like bytes, runes, slices, standard library ("fmt" and things it imports), etc.
That part is a constant cost. Once you write more code, the generated output grows gracefully, as expected.
The output is quite large (on the order of 1~2 MB), even when minified, so it might not work well yet for user-facing sites with huge traffic. Reducing output size is an area where GopherJS could keep improving. However, for smaller sites or offline utilities it works absolutely great.
The raw output is very large, but gzipped it's surprisingly small. For example, an app GopherJS produces in my framework is about 1.2MB minified, but just over 200KB gzipped, so it's not a problem.
Download size isn't the only issue. Larger scripts are slower to parse and startup, slow down debugging tools, inhibit JS VM optimizations and use up more heap in general.
This may not matter on desktop intranets, but it will matter on the mobile web.
Is there any hope for a workflow similar to ClojureScript's? It's very nice to have your page hooked up to a REPL and be able to re-evaluate pieces of code and have the page's behavior change without constant refreshes and recompiles.
Edit: I want to expand on this a bit. I'm curious about what the workflow is now and what's envisioned. Things like the above, and LightTable's insta-repl, make for great workflow value propositions. I very much like Clojure, but I like Go as well and have written much, much more of it. So this is very interesting, but... do I need to write the Go code blind, then compile to JS, then refresh and test my page to see what's going on?
There is no REPL, but there are projects in development that make the workflow less of a chore. My own project, SRVi (https://github.com/ajhager/srvi), serves up your GopherJS project, recompiling and embedding when you refresh the page. It will even report compiler errors right in the browser.
Still plenty of room for improvement, but there are people who are excited about making GopherJS fun to work with.
I've written convenience http handlers that compile Go to js on demand (and report errors in the browser console & server stderr). I do something like this in the backend:
That way, I can edit the ./assets/script.go file in my code editor (with goimports on save, autocomplete, etc.) with the server running. Whenever I refresh the page in the browser, new compiled JavaScript is loaded.
It's really nice to be able to edit the Go script, CSS, and HTML templates without needing to restart the web server for changes to take effect. Just refresh the page and see the new code running.
That's my workflow so far.
I have plans to improve that code so that it accesses the assets via a virtual file system interface, so that it's possible to have a debug version that hot-reloads all files from disk, and a release version that embeds the assets into the Go binary so it can run anywhere.
Very interesting, so it's sort of a REPL server. Using Node.js you could even execute and return the results... I'm starting to see how someone (who knows what they're doing) could create a proof-of-concept LightTable plugin...
Anyone know how hard it would be to get a ReactJS binding set up for this? I was literally just looking into GopherJS (for normal frontend stuff), but I'm unsure how gnarly it would be to set up bindings.
Part of me fears the output though. It can be rather verbose, large, and is quite abstract from the original code. Debugging and code-size worry me greatly.
I'm curious about this too. I've heard good things about ReactJS. There are AngularJS bindings for GopherJS, so you might want to look at those. I'm guessing it should be doable.
I would say it's great work. But in order to write efficient code, you still need to understand both languages; it only saves you from having to remember JavaScript syntax. Since we have Google and Stack Overflow, people can pretty much write more idiomatic code than a machine can. So my conclusion is that it's not worth it.
Why use a server-side language developed by Google (Go) as an alternative to JavaScript, when you could use a client-side language developed by Google and intended as an alternative to JavaScript (Dart)?
As of this month, GWT has 150,000 monthly active developers and is deployed to 20,000 unique domains that we can count (we can't count intranets). The user base has tripled since 2009 and has held relatively steady.
http://en.wikipedia.org/wiki/Gopher_(protocol)
Time to pour one out for protocols loved and lost...