[url]http://luabin.overvprojects.nl/?path=/lua/includes/extensions/table.lua[/url]
line 27
Noob UNIX user here. Working on a script for class.
I need to create a file with a name unique to the date (e.g. 111320111030.rpt = 11/13/2011 10:30). Then I have to add data to that file, which is usually no problem, except that the file's name changes every time you run the script.
Is there a way to do this other than creating a file with a static name then renaming it later?
[QUOTE=MTMod;33267760]Noob UNIX user here. Working on a script for class.
I need to create a file with a name unique to the date (e.g. 111320111030.rpt = 11/13/2011 10:30). Then I have to add data to that file, which is usually no problem, except that the file's name changes every time you run the script.
Is there a way to do this other than creating a file with a static name then renaming it later?[/QUOTE]
Symlink 111320111030.rpt -> current.rpt
In your script, when you generate the filename, put it in a shell variable, e.g. $filename. Then you can use the variable as part of whatever command writes data into the file.
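That approach can be sketched like this (assuming GNU date and the MMDDYYYYHHMM format from the example; the variable and file contents are placeholders):

```shell
# Generate the timestamped name once (MMDDYYYYHHMM.rpt), keep it in a
# variable, then reuse that variable for every write to the file.
filename="$(date +%m%d%Y%H%M).rpt"
echo "report header" > "$filename"
echo "more data" >> "$filename"
```

Because the name is computed once at the top, every later redirection targets the same file, no rename step needed.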
We're going to dig up an ancient program, boys: VB 6.0, to be specific. I wish my school had something more recent, but anyway:
I need a character set of only 128 characters (7-bit ASCII), but the Chr function in VB covers the full 256-character, 8-bit range. I could change a few variables so my program would be compatible with an 8-bit character set, but I'm too lazy to do so. So is there a way to use a 128-character set?
I know this is really simple but it's been bugging me for an hour or so, and search engines are useless in regard to my query.
What
Just
Use only ASCII characters?
Having trouble with tar in UNIX.
How can I get it to compress a single directory and not the subdirectories that are in it?
[QUOTE=MTMod;33273784]Having trouble with tar in UNIX.
How can I get it to compress a single directory and not the subdirectories that are in it?[/QUOTE]
Why would you do that, first of all?
[QUOTE=esalaka;33274612]Why would you do that, first of all?[/QUOTE]
Because the archive itself is in the subdirectory.
Well shit. You just made me realize that I misinterpreted the instructions. The archive directory shouldn't be a subdirectory. HERPDERP
Thanks!
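For anyone who hits the problem as originally stated (archiving a directory while keeping the archive itself in a subdirectory of it), tar's --exclude option covers it; a sketch with assumed directory and file names:

```shell
# Archive everything in mydir into mydir/archive/out.tar, excluding the
# archive subdirectory so the tarball doesn't try to include itself.
mkdir -p mydir/archive
touch mydir/report.txt
tar -cf mydir/archive/out.tar --exclude='./archive' -C mydir .
```

The -C switches into the directory first, so member names are relative and the exclude pattern matches './archive' cleanly.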
I want to know how to download files using Lua and the Love engine. Google found me this code:
[lua]
local host = "www.w3.org"
local file = "/TR/REC-html32.html"
require("socket.http")
local c, count = nil, 0
function love.load()
c = assert(socket.connect(host, 80))
c:send("GET " .. file .. " HTTP/1.0\r\n\r\n")
while true do
local s, status = receive(c)
print(s,status)
count = count + string.len(s or "")
if status == "closed" then break end
end
c:close()
print(file, count)
end
function receive (connection)
local s, status = connection:receive(2^10)
if status == "timeout" then
coroutine.yield(connection)
end
return s, status
end
[/lua]
Getting the contents of that file ([url]www.w3.org/TR/REC-html32.html[/url]) with this code gives me no problems. Perfectly outputs the size of the file.
But when I change the file to, let's say, a random file from my friend's site (my site is currently down, sadface), it instantly closes the connection. Any ideas?
(The file I was trying to get was [url]www.cwan.nl/Content/GameIdea.txt[/url])
Yes, I split the path into
[lua]
local host = "www.cwan.nl"
local file = "/Content/GameIdea.txt"
[/lua]
[url]http://pastebin.com/fAwQuQjx[/url]
It would open and close immediately, in less than a second. I've just realized that it's caused by an image failing to load; how would I properly load images?
[QUOTE=Darkwater124;33275475]I want to know how to download files using Lua and the Love engine. Google found me this code:
[lua]
local host = "www.w3.org"
local file = "/TR/REC-html32.html"
require("socket.http")
local c, count = nil, 0
function love.load()
c = assert(socket.connect(host, 80))
c:send("GET " .. file .. " HTTP/1.0\r\n\r\n")
while true do
local s, status = receive(c)
print(s,status)
count = count + string.len(s or "")
if status == "closed" then break end
end
c:close()
print(file, count)
end
function receive (connection)
local s, status = connection:receive(2^10)
if status == "timeout" then
coroutine.yield(connection)
end
return s, status
end
[/lua]
Getting the contents of that file ([url]www.w3.org/TR/REC-html32.html[/url]) with this code gives me no problems. Perfectly outputs the size of the file.
But when I change the file to, let's say, a random file from my friend's site (my site is currently down, sadface), it instantly closes the connection. Any ideas?
(The file I was trying to get was [url]www.cwan.nl/Content/GameIdea.txt[/url])
Yes, I split the path into
[lua]
local host = "www.cwan.nl"
local file = "/Content/GameIdea.txt"
[/lua][/QUOTE]
I recommend changing your request to HTTP/1.1 and adding a Host header. The reason it is not working is almost certainly that the webhost serves multiple sites from a single machine using virtual hosts: since the web server can't tell which website you're actually asking for, it balks and bails out. Your properly formatted request should look like:
[code]
GET {file} HTTP/1.1
Host: {host}
[/code]
[QUOTE=CoolKingKaso;33277749][url]http://pastebin.com/fAwQuQjx[/url]
It would open and close immediately, in less than a second. I've just realized that it's caused by an image failing to load; how would I properly load images?[/QUOTE]
The code is fine. Are you sure the image is in the same directory as the executable?
[QUOTE=Niteshifter;33278200]The code is fine. Are you sure the image is in the same directory as the executable?[/QUOTE]
It's in there, I'm not really sure why it isn't detecting it.
[t]http://i.imgur.com/LmXrh.png[/t]
Is the file extension capital?
It isn't.
Try initialising 'quit' to false.
I have the worst luck ever; it still does the same thing it did before.
Incoming functional programming (Haskell)!
I have this little piece of code to draw a simple fractal, and I'm getting a parse error, but I can't see the issue:
[code]cooly :: Int -> Command
cooly x = p :#: f x
where
f 0 = Go 10
f (x+1) = f x :#: p :#: f x :#: n :#: f x :#: n :#: f x :#: p :#: f x
p = Turn (-90)
n = Turn 90[/code]
I get a parse error on the line beginning "f (x+1)". Strange.
[QUOTE=CoolKingKaso;33279281]I have the worst luck ever, it still did the thing it previously did to me.[/QUOTE]
Try [url=http://pastebin.com/CVZ0jX9z]this[/url]. It creates a basic window with the sprite, but there are a couple of small details that differ from yours.
[QUOTE=nos217;33279784]I get a parse error on the line beginning "f (x+1)". Strange.[/QUOTE]
You can't pattern-match with (x+1). Try defining it as "f x" and using "f (x-1)" in the definition.
[editline]14th November 2011[/editline]
Actually, apparently it [url=https://sites.google.com/site/haskell/notes/nkpatterns]used to be allowed[/url], but was [url=http://hackage.haskell.org/trac/haskell-prime/wiki/RemoveNPlusK]removed[/url] in Haskell 2010 because it complicates the parser for no real benefit.
[QUOTE=mechanarchy;33278124]I recommend changing your request to HTTP/1.1 and adding a Host header. The reason it is not working is almost certainly that the webhost serves multiple sites from a single machine using virtual hosts: since the web server can't tell which website you're actually asking for, it balks and bails out. Your properly formatted request should look like:
[code]
GET {file} HTTP/1.1
Host: {host}
[/code][/QUOTE]
Thanks, I'll try this later when I get home.
Also, I found a working script to unzip a zip file, but I've got one problem: I can only use an absolute filepath. Does anyone know how I can grab a file from the folder in AppData, the only folder where Love can save files?
[QUOTE=Wyzard;33280563]You can't pattern-match with (x+1). Try defining it as "f x" and using "f (x-1)" in the definition.
[editline]14th November 2011[/editline]
Actually, apparently it [url=https://sites.google.com/site/haskell/notes/nkpatterns]used to be allowed[/url], but was [url=http://hackage.haskell.org/trac/haskell-prime/wiki/RemoveNPlusK]removed[/url] in Haskell 2010 because it complicates the parser for no real benefit.[/QUOTE]
Interesting you should say that: our lecturer is [url=http://homepages.inf.ed.ac.uk/wadler/]this guy[/url], and he was one of the main contributors to the language. He taught us to use "f (x+1)" for pattern matching in recursive functions because he finds it more intuitive. I'll need to bring this up with him if it's been removed. He'll be pissed.
Ah, he brought it up in today's lecture. There's a flag you can pass to make it accept those kinds of patterns.
[QUOTE=Darkwater124;33283407][QUOTE=mechanarchy;33278124]I recommend changing your request to HTTP/1.1 and adding a Host header. The reason it is not working is almost certainly that the webhost serves multiple sites from a single machine using virtual hosts: since the web server can't tell which website you're actually asking for, it balks and bails out. Your properly formatted request should look like:
[code]
GET {file} HTTP/1.1
Host: {host}
[/code][/QUOTE]
Thanks, I'll try this later when I get home.
Also, I found a working script to unzip a zip file, but I've got one problem: I can only use an absolute filepath. Does anyone know how I can grab a file from the folder in AppData, the only folder where Love can save files?[/QUOTE]
I'm now using this as the request:
"GET "..file.." HTTP/1.1\r\nHost: "..host.."\r\n\r\n"
Which would evaluate to:
"GET /data/test.txt HTTP/1.1\r\nHost: www.novaember.com\r\n\r\n"
(Yes, my own site is back up)
It freezes for exactly 5 seconds and closes again without returning anything. Did I do something wrong?
[QUOTE=nos217;33284205]There's a flag you can pass to make it accept those kinds of patterns.[/QUOTE]
I'm not surprised, since an implementation that supports Haskell 98 will have a parser that can handle n+k patterns anyway. But from a Haskell 2010 standpoint, you're basically relying on a nonstandard compiler extension.
[QUOTE=Darkwater124;33286518]I'm now using this as the request:
"GET "..file.." HTTP/1.1\r\nHost: "..host.."\r\n\r\n"
Which would evaluate to:
"GET /data/test.txt HTTP/1.1\r\nHost: www.novaember.com\r\n\r\n"
(Yes, my own site is back up)
It freezes for exactly 5 seconds and closes again without returning anything. Did I do something wrong?[/QUOTE]
Through experimentation with telnet, I found that your server refuses my connection unless I add the request header
[code]
Accept: text/html
[/code]
That would make the entire request
[code]
GET /data/test.txt HTTP/1.1
Host: www.novaember.com
Accept: text/html
[/code]
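Assembled in code, that request string looks like this (a Python sketch rather than the thread's Lua; the Connection: close header is an extra assumption on my part: it asks the server to end the response rather than hold the connection open, which under HTTP/1.1 keep-alive can otherwise look like a hang):

```python
host = "www.novaember.com"
path = "/data/test.txt"

# HTTP/1.1 request: Host is mandatory, Accept satisfies this particular
# server, and Connection: close asks the server to finish and close
# instead of keeping the connection alive after the response.
request = (
    "GET " + path + " HTTP/1.1\r\n"
    "Host: " + host + "\r\n"
    "Accept: text/html\r\n"
    "Connection: close\r\n"
    "\r\n"
)
```

The blank line (the final \r\n\r\n) is what marks the end of the headers; without it the server keeps waiting for more.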
Is Accept really needed for HTTP 1.1?
[editline]15th November 2011[/editline]
I mean, should it always be needed?
[QUOTE=esalaka;33288177]Is Accept really needed for HTTP 1.1?
[editline]15th November 2011[/editline]
I mean, should it always be needed?[/QUOTE]
I'm not sure, but in this case [url]www.novaember.com[/url] was rejecting my connection unless I provided the Accept field.
Okay, urgent question:
What algorithms should I look into to find the shortest path between two nodes in a graph based on [I]two[/I] path weights? Simplifying factor here is: One path cost is a primary/critical one - i.e. it [I]must[/I] be minimized. The secondary one must be minimized if several paths are found to have the same minimum total primary cost.
I realize I can just run any shortest-path algorithm on the primary cost, once for every destination node, store all the paths matching the minimum primary cost, and finally select the one with the minimum secondary cost. But that hardly seems memory- or speed-efficient. Are there any specific algorithms or techniques suited to this?
Seems to me you'd be better off summing the primary and secondary weights in some way, but if you absolutely must minimize the primary one, then I'd just do that and then, in the case of a tie, sum and then compare the secondary costs of the two paths.
[editline]15th November 2011[/editline]
Seems like it wouldn't have much of an effect though; I imagine it's somewhat rare to get path cost ties.
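That tie-break can also be folded straight into Dijkstra's algorithm by comparing (primary, secondary) cost pairs lexicographically, so the secondary cost only matters when primary costs tie. A minimal sketch (Python; the function name and graph format are assumed for illustration):

```python
import heapq

def lexi_dijkstra(graph, start, goal):
    """Shortest path minimizing primary cost, secondary cost as tie-breaker.

    graph: {node: [(neighbor, primary_cost, secondary_cost), ...]}
    Returns (primary, secondary, path) or None if goal is unreachable.
    """
    INF = (float("inf"), float("inf"))
    best = {start: (0, 0)}
    # Heap entries are (primary, secondary, node, path); Python compares
    # tuples lexicographically, which is exactly the ordering needed here.
    heap = [(0, 0, start, [start])]
    while heap:
        p, s, node, path = heapq.heappop(heap)
        if node == goal:
            return p, s, path
        if (p, s) > best.get(node, INF):
            continue  # stale heap entry; a better pair was already settled
        for nxt, dp, ds in graph.get(node, []):
            cand = (p + dp, s + ds)
            if cand < best.get(nxt, INF):
                best[nxt] = cand
                heapq.heappush(heap, (cand[0], cand[1], nxt, path + [nxt]))
    return None
```

This is a single run to one goal; for all destinations, drop the early return and read the best dict afterwards. It stays O((V+E) log V) like plain Dijkstra, since the pair comparison is constant-time.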