[mythtv-users] apache 2gb limit? mythweb

Steve Malenfant smalenfant at gmail.com
Thu Mar 17 22:33:17 UTC 2005


I've heard that Apache 2.1 will support LFS, but not earlier releases.
I have Apache 1.3 running on a Sun Ultra 5 with LFS support, so
I think apache 1.3 can be compiled with LFS enabled.
An Athlon 64 might be able to use Apache 2.0 with LFS (not sure about
that, but would a native 64-bit build even need separate LFS support?)

Steve

On Thu, 17 Mar 2005 14:28:49 -0800, M. Barnabas Luntzel
<mark at luntzel.com> wrote:
> I think the culprit is php. I didn't bother digging into it because I
> didn't care enough before. But since you asked...
> 
> is something like this in your error log?
> 
> [Thu Mar 17 14:22:31 2005] [error] [client 192.168.1.1] (75)Value too
> large for defined data type: access to
> /mythweb/video_dir/1501_20050313180000_20050313190000.nuv failed,
> referer: http://xxxx.xx:8080/mythweb/recorded_programs.php
> 
> bug seems to be kinda old:
> 
> http://bugs.php.net/bug.php?id=27792&edit=3
> 
> but I think that's what you're up against. You may need to recompile
> php, but that's a pain if you're sticking to rpms like I am. I
> especially like this line:
> 
> [22 Oct 2004 1:53am CEST] edink at php.net
> This bug has been fixed in CVS.
> 
> I didn't dig too much further, there might be a similar bug posted that
> I didn't check for.
> 
> On Mar 17, 2005, at 1:31 PM, sschaefer1 at woh.rr.com wrote:
> 
> > If you are transmitting a large file, then the browser you are
> > connecting to the webserver with will also need to handle large
> > files. When I was coding my webserver I ran into this issue and
> > found that the problem usually lies on the browser side.
> >
> > ----- Original Message -----
> > From: Jeff Simpson <llcooljeff at gmail.com>
> > Date: Thursday, March 17, 2005 4:10 pm
> > Subject: [mythtv-users] apache 2gb limit? mythweb
> >
> >> Not directly myth related, but I'm sure that somebody else on this
> >> list has run into it before. I'd like to use mythweb as a way to get
> >> an http link to the raw NUV files (to use with wget or similar). When
> >> attempting to download files > 2gb in size, apache throws a strange
> >> error (I don't have the error handy, but it was clear that it was a
> >> file size issue). Is there a particular way that apache needs to be
> >> compiled so that large files will work? I'm running 2.0.52-rc1 on a
> >> gentoo system. (Maybe a USE flag that enables large file support?)
> >>
> >> --
> >> email me if you want a gmail invite, I have some
> >> _______________________________________________
> >> mythtv-users mailing list
> >> mythtv-users at mythtv.org
> >> http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
> >>
> >
> 
> 

