
[Orekit Developers] [SOCIS 2011] New snapshot with event detection



Hi,

So it has been nearly one week since the last snapshot. Event
detection was in fact far harder to implement than I originally
imagined, so I have a lot to tell and I'll try to be concise :)
This snapshot should support it.

Firstly, here is the snapshot URL:
https://www.orekit.org/forge/attachments/download/59/orekit-20110817.apk

Follow the same instructions as for the older snapshots if you want
to install it.

Some details about this snapshot:

* I originally didn't think that the hardest task of this part would
be simply drawing a table. On Android, there is no widget to draw a
table. You have GridView, which has nothing to do with our needs, and
TableLayout, which 1. doesn't draw borders and 2. doesn't manage
scrollbars.

So I ended up writing a widget inheriting from TableLayout and using a
well-known hack to draw the borders, which is to play with
padding/margins so a black background shows through as a border (I
would be happy to use a cleaner method. I had one, but when you add a
border to each cell, you end up with *two* borders wherever two cells
sit side by side, which is not pretty at all).
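
For reference, the hack boils down to something like this (a
simplified sketch, not the exact widget code; the class and method
names are only for illustration):

    import android.content.Context;
    import android.graphics.Color;
    import android.widget.TableLayout;
    import android.widget.TableRow;
    import android.widget.TextView;

    // Simplified sketch of the margin/background trick used to fake borders.
    public class BorderedTableLayout extends TableLayout {

        public BorderedTableLayout(Context context) {
            super(context);
            // The table background is black; it shows through the 1px
            // margins around each cell and looks like a grid of borders.
            setBackgroundColor(Color.BLACK);
        }

        // Adds a white cell with a 1px margin on each side to a row.
        public void addCell(TableRow row, String text) {
            TextView cell = new TextView(getContext());
            cell.setText(text);
            cell.setBackgroundColor(Color.WHITE);
            cell.setPadding(8, 4, 8, 4);
            TableRow.LayoutParams params = new TableRow.LayoutParams(
                    TableRow.LayoutParams.WRAP_CONTENT,
                    TableRow.LayoutParams.WRAP_CONTENT);
            // Black shows through here (in practice the margins are
            // tweaked so adjacent cells don't double the gap).
            params.setMargins(1, 1, 1, 1);
            row.addView(cell, params);
        }
    }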

After this, I wanted to add scrollbars, because there will be some
data here and it may not fit on the screen. On Android there are two
ScrollViews: HorizontalScrollView, and ScrollView for vertical
scrolling. BUT if you put a HorizontalScrollView inside a ScrollView,
the one on top absorbs the touch events and you will not be able to
scroll. It's a well-known bug and the official response is "write your
own ScrollView". That isn't a valid answer, because if you take
ScrollView.java and put it into your code, you will see that it
modifies private/protected fields you don't have access to! (Maybe
there is a way, but still, at that point I was a little bit tired of
having already spent a day and a half on this.)

In fact I didn't do exactly that: TableView (our own widget) embeds
some code from ScrollView and re-implements scrolling using the
lower-level helpers provided by the basic View class (which all
widgets inherit from; by the way, on Android, what you would call a
"widget" in Qt or elsewhere is called a View).

The fun part is that it relies on very poorly documented code
(essentially scrollbar initialization), where all you find are "Go
read the javadocs, everything is documented!" posts by official
Android developer advocates (the split between the official API and
the internal API made it even worse, because I couldn't find in the
Android source code how ScrollView initializes its scrollbars; I still
don't know what magic it uses, maybe I haven't looked hard enough). I
still managed to get it working thanks to some pointers from a
StackOverflow post, but this whole thing took me two to three days.
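
For the curious, the drag-to-scroll part boils down to something like
this (a simplified sketch inside the TableView class, using only View
helpers; a real implementation also needs to clamp the scroll offset,
and scrollbar thumbs would be sized by overriding View's
computeHorizontalScrollRange()/computeVerticalScrollRange()):

    // Inside TableView: minimal drag-to-scroll built on the View helpers.
    private float lastX;
    private float lastY;

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
            lastX = event.getX();
            lastY = event.getY();
            return true;
        case MotionEvent.ACTION_MOVE:
            int dx = (int) (lastX - event.getX());
            int dy = (int) (lastY - event.getY());
            lastX = event.getX();
            lastY = event.getY();
            scrollBy(dx, dy);   // no clamping here; a real widget needs it
            awakenScrollBars(); // protected View helper, shows the scrollbars
            return true;
        default:
            return super.onTouchEvent(event);
        }
    }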

Maybe I just didn't see some easy and nice solution that was right
there, or maybe a more skilled Android developer would find one
instantly, but I currently don't know how to draw a table with borders
plus horizontal and vertical scrolling without doing this.

This widget could be better (for instance by adding velocity tracking
to the scrolling), but it works :) And it's cleaner than generating
HTML on the fly and displaying it in a WebKit widget :)

* The bug where some frames weren't loading, because Eclipse doesn't
copy META-INF files from the Orekit jar into the apk, is still open.
It was blocking me from using visibility detection from a ground
station (the approach described in the tutorial Java files needs these
data files), so I made a little workaround to be able to continue, but
I still need to fix this properly.

The workaround was to copy these data files into the project's
assets/ folder and to modify Orekit to look inside /assets/ instead of
/META-INF/. This is very crappy, but it's a temporary solution.
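
A slightly less intrusive variant I might switch to is to copy the
files from assets/ into the app's private storage at startup and
register that directory with Orekit, instead of patching Orekit
itself. Roughly (untested sketch; "orekit-data" and the class name are
made up, and I'm assuming DataProvidersManager/DirectoryCrawler behave
on Android like they do on the desktop):

    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.InputStream;
    import java.io.OutputStream;

    import org.orekit.data.DataProvidersManager;
    import org.orekit.data.DirectoryCrawler;

    import android.content.Context;

    // Sketch: copy the data files shipped in assets/orekit-data into the
    // app's private files directory once, then let Orekit crawl that
    // directory (flat directory only, no recursion).
    public final class OrekitDataSetup {

        public static void init(Context context) throws Exception {
            File dataDir = new File(context.getFilesDir(), "orekit-data");
            if (!dataDir.exists()) {
                dataDir.mkdirs();
                for (String name : context.getAssets().list("orekit-data")) {
                    InputStream in =
                            context.getAssets().open("orekit-data/" + name);
                    OutputStream out =
                            new FileOutputStream(new File(dataDir, name));
                    byte[] buffer = new byte[8192];
                    int read;
                    while ((read = in.read(buffer)) != -1) {
                        out.write(buffer, 0, read);
                    }
                    out.close();
                    in.close();
                }
            }
            // Register the directory instead of relying on META-INF resources.
            DataProvidersManager.getInstance()
                    .addProvider(new DirectoryCrawler(dataDir));
        }
    }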

I've looked everywhere to figure out whether it's possible to force
Eclipse to embed this META-INF directory, but the only way I found was
to add a custom Ant build script which would call the "aapt" tool
after the build to add the META-INF directory to the final apk. I
didn't do it because I have never written Ant scripts, and I still
want to try to find a better way.

* By the way, here is how event detection works in the Android
application. I wanted to make it as extensible as possible, so that
new events can be added pretty easily (there is a rough code sketch of
the resulting contract after the list).

  1. When you press "add event", the application asks the Android
system for activities which can respond to the Intent
"org.orekit.android.selector.EVENT". If you have multiple event
detectors, you get a list of the available events (right now there is
only visibility detection, so you won't see that list). You should
also be able to have event detectors declared by applications outside
Orekit (which will act as plugins).
  2. The event activity starts; it can request parameters (but doesn't
have to), and should return an instance of EventProxy, which is an
abstract class. As the visibility detection event doesn't need any
parameters in this snapshot, its window is closed before you even see
it.
  3. The list of EventProxy instances goes to the computation window.
  4. The method instance.isNeedingStations() is called; it returns
true if this event type requires "ground station" information.
  5. The method instance.load(StationProxy station, EventLog eventlog)
is called, once per station if the event requires station info, or
once with station = null otherwise. It returns an EventDetector
instance.
  6. The eventlog parameter is an EventLog instance, which is used to
pass the data back to the UI; it is described just below.
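
To make that concrete, the contract described above looks roughly like
this (simplified sketch, not the exact code of the snapshot):

    import java.util.List;

    import org.orekit.propagation.events.EventDetector;

    import android.content.Intent;
    import android.content.pm.ResolveInfo;

    // The contract every event plugin implements (simplified).
    public abstract class EventProxy {

        // true if this event type needs ground station information (step 4)
        public abstract boolean isNeedingStations();

        // Called once per station, or once with station == null if no
        // stations are needed (step 5); returns the Orekit detector and
        // keeps the EventLog around to report events back to the UI.
        public abstract EventDetector load(StationProxy station,
                                           EventLog eventlog);
    }

    // Inside the computation activity, "add event" discovers the
    // available detectors roughly like this (step 1):
    Intent intent = new Intent("org.orekit.android.selector.EVENT");
    List<ResolveInfo> detectors =
            getPackageManager().queryIntentActivities(intent, 0);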

EventLog is very simple to use: you have a write method which writes a
line into the table. It works this way:
eventlog.write("Visibility Detection", new String[]{"key1", "value1",
"key2", "value2"});
The first argument is a tag telling the user which event detector the
event comes from; the second argument is an array where the even items
are keys and the odd ones are values. Keys are column names.
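
For instance, the visibility detector could log a pass like this
(column names and values here are just an example, not necessarily
what the app prints):

    eventlog.write("Visibility Detection",
                   new String[]{"Station", "Kourou",
                                "Start",   "2011-08-17T12:34:56",
                                "End",     "2011-08-17T12:42:10"});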

It has been written with the idea that columns might be shared between
event detectors. When you add columns A, B and then C, B, it will not
produce A, B, C (which may not respect the way the author wants the
data displayed), but A, C, B: C is inserted between the existing A and
B instead of being appended at the end. This merging is not done when
write() is called, but when renderTable() is called, which returns a
String[][] you then feed to TableView.setTable().
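
Concretely, with made-up column names and values:

    // The first detector uses columns A and B, the second one C and B.
    eventlog.write("Detector 1", new String[]{"A", "a1", "B", "b1"});
    eventlog.write("Detector 2", new String[]{"C", "c2", "B", "b2"});

    // renderTable() merges the headers as A, C, B: C is inserted before
    // the shared column B instead of being appended at the end.
    String[][] table = eventlog.renderTable();
    tableView.setTable(table); // tableView being the TableView widget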

By the way, the algorithm for merging columns is pretty naive (a very
rough complexity bound would be O(|lines| * |columns|^4), though I
don't know if it's actually reachable), but this shouldn't be a
problem for us as we won't have many columns.

* By the way, when you configure a station, the current coarse
location fetching doesn't work on the emulator.

* Event detection has a HUGE problem: I'm using a ListView to show the
list of events and the list of stations, and when there are too many
stations, the ListView gets its own scrollbar. But to deal with small
phone screens, I need to put a ScrollView at the root of the window so
you can scroll down to see the end of the form (this is already done
on the Frame form and the Impulse maneuver form). Doing so on this
form triggers the bug I mentioned at the beginning of this mail: the
top ScrollView "absorbs" the touch events and the ListView fails to
scroll.

Currently, on a phone there is so much information stacked vertically
that the form is unusable: the ListView doesn't have room to show even
one element, and the "+" button doesn't have room to show its content.

There are three possible solutions for that:
  1. Reduce the amount of data shown (for instance by moving the
"Stations" list into the visibility detection plugin, even if that may
mean entering station data multiple times if several plugins require
it)
  2. Move the form to a wizard (if I figure out how to do that)
  3. Make our own view which inherits from ScrollView and adds a
protect(View v) method which measures the position/size of the View v
and forwards the touch events that land inside it to that View instead
of absorbing them (a sketch of this idea follows the list)
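
Here is roughly what option 3 could look like (an untested sketch; the
class name is just for illustration, and it assumes the protected
views are direct children of the ScrollView's content layout):

    import java.util.ArrayList;
    import java.util.List;

    import android.content.Context;
    import android.graphics.Rect;
    import android.view.MotionEvent;
    import android.view.View;
    import android.widget.ScrollView;

    // Sketch of option 3: a ScrollView which leaves touch events alone
    // when they land inside a "protected" child (like the stations ListView).
    public class ProtectingScrollView extends ScrollView {

        private final List<View> protectedViews = new ArrayList<View>();

        public ProtectingScrollView(Context context) {
            super(context);
        }

        // Views registered here keep handling their own touch events.
        public void protect(View v) {
            protectedViews.add(v);
        }

        @Override
        public boolean onInterceptTouchEvent(MotionEvent event) {
            // Convert to content coordinates (account for the scroll offset).
            int x = (int) event.getX() + getScrollX();
            int y = (int) event.getY() + getScrollY();
            Rect hit = new Rect();
            for (View v : protectedViews) {
                v.getHitRect(hit); // bounds relative to its parent
                if (hit.contains(x, y)) {
                    return false;  // don't intercept, let the child scroll
                }
            }
            return super.onInterceptTouchEvent(event);
        }
    }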

* Also, some benchmarks for event detection on my Nexus One (1 GHz
CPU, Android 2.3 with JIT enabled, cold Orekit cache): 56 seconds to
load the data, 16 seconds to run the simulation. I've set maxCheck to
5 seconds hoping to improve that. Is that OK? Should I make it a
setting?
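
If we do make maxCheck a setting, I would probably just read it from
the shared preferences in the computation activity (a trivial sketch;
"max_check" is a made-up preference key):

    // Hypothetical: read maxCheck (in seconds) from the app preferences,
    // defaulting to the current hard-coded 5 seconds.
    SharedPreferences prefs =
            PreferenceManager.getDefaultSharedPreferences(this);
    double maxCheck = Double.parseDouble(prefs.getString("max_check", "5"));
    // ... then pass maxCheck to the visibility detector's maxCheck parameter.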

* None of the UI changes we discussed are included yet, because I
wanted to focus on finishing the features before attacking UI
polishing :)

By the way, as there is only visibility detection right now, which
event detector would you like me to implement next in order to try
this out? :)

Have a nice day, and sorry for having written such a long text.

If you have any questions, or any suggestions, feel free to ask :)

Alexis Robert